00:00:00.001 Started by upstream project "autotest-nightly-lts" build number 2428 00:00:00.001 originally caused by: 00:00:00.002 Started by upstream project "nightly-trigger" build number 3689 00:00:00.002 originally caused by: 00:00:00.002 Started by timer 00:00:00.002 Started by timer 00:00:00.015 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.021 The recommended git tool is: git 00:00:00.021 using credential 00000000-0000-0000-0000-000000000002 00:00:00.025 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.038 Fetching changes from the remote Git repository 00:00:00.044 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.057 Using shallow fetch with depth 1 00:00:00.057 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.057 > git --version # timeout=10 00:00:00.069 > git --version # 'git version 2.39.2' 00:00:00.069 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.079 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.079 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.650 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.660 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:02.671 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD) 00:00:02.671 > git config core.sparsecheckout # timeout=10 00:00:02.682 > git read-tree -mu HEAD # timeout=10 00:00:02.698 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5 00:00:02.725 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag" 00:00:02.725 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10 00:00:02.810 [Pipeline] Start of Pipeline 00:00:02.825 [Pipeline] library 00:00:02.827 Loading library shm_lib@master 00:00:02.828 Library shm_lib@master is cached. Copying from home. 00:00:02.847 [Pipeline] node 00:00:02.876 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:02.878 [Pipeline] { 00:00:02.889 [Pipeline] catchError 00:00:02.890 [Pipeline] { 00:00:02.903 [Pipeline] wrap 00:00:02.911 [Pipeline] { 00:00:02.920 [Pipeline] stage 00:00:02.922 [Pipeline] { (Prologue) 00:00:03.129 [Pipeline] sh 00:00:03.411 + logger -p user.info -t JENKINS-CI 00:00:03.425 [Pipeline] echo 00:00:03.426 Node: WFP20 00:00:03.435 [Pipeline] sh 00:00:03.740 [Pipeline] setCustomBuildProperty 00:00:03.753 [Pipeline] echo 00:00:03.755 Cleanup processes 00:00:03.761 [Pipeline] sh 00:00:04.048 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.048 1477123 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.061 [Pipeline] sh 00:00:04.343 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.343 ++ grep -v 'sudo pgrep' 00:00:04.343 ++ awk '{print $1}' 00:00:04.343 + sudo kill -9 00:00:04.343 + true 00:00:04.357 [Pipeline] cleanWs 00:00:04.367 [WS-CLEANUP] Deleting project workspace... 00:00:04.367 [WS-CLEANUP] Deferred wipeout is used... 
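
The pgrep cleanup above is the standard kill-stale-processes pattern: list anything still running under the workspace, filter out the pgrep invocation itself, and kill the surviving PIDs. A minimal sketch of that pattern, with the workspace path taken from the log (the variable name is illustrative, not the CI's exact script):

WORKSPACE=/var/jenkins/workspace/short-fuzz-phy-autotest
# List matching processes with full command lines, drop the pgrep line itself,
# and keep only the PID column.
PIDS=$(sudo pgrep -af "$WORKSPACE/spdk" | grep -v 'sudo pgrep' | awk '{print $1}')
# kill fails when $PIDS is empty, so tolerate that (mirrors the "+ true" above).
sudo kill -9 $PIDS || true
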
00:00:04.372 [WS-CLEANUP] done 00:00:04.376 [Pipeline] setCustomBuildProperty 00:00:04.388 [Pipeline] sh 00:00:04.673 + sudo git config --global --replace-all safe.directory '*' 00:00:04.747 [Pipeline] httpRequest 00:00:05.383 [Pipeline] echo 00:00:05.385 Sorcerer 10.211.164.20 is alive 00:00:05.392 [Pipeline] retry 00:00:05.393 [Pipeline] { 00:00:05.404 [Pipeline] httpRequest 00:00:05.408 HttpMethod: GET 00:00:05.408 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:05.409 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:05.412 Response Code: HTTP/1.1 200 OK 00:00:05.412 Success: Status code 200 is in the accepted range: 200,404 00:00:05.412 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:05.968 [Pipeline] } 00:00:05.986 [Pipeline] // retry 00:00:05.992 [Pipeline] sh 00:00:06.275 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:06.289 [Pipeline] httpRequest 00:00:06.866 [Pipeline] echo 00:00:06.868 Sorcerer 10.211.164.20 is alive 00:00:06.878 [Pipeline] retry 00:00:06.879 [Pipeline] { 00:00:06.890 [Pipeline] httpRequest 00:00:06.894 HttpMethod: GET 00:00:06.894 URL: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:06.895 Sending request to url: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:06.898 Response Code: HTTP/1.1 200 OK 00:00:06.898 Success: Status code 200 is in the accepted range: 200,404 00:00:06.899 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:28.662 [Pipeline] } 00:00:28.681 [Pipeline] // retry 00:00:28.689 [Pipeline] sh 00:00:28.980 + tar --no-same-owner -xf spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:31.537 [Pipeline] sh 00:00:31.822 + git -C spdk log --oneline -n5 00:00:31.822 c13c99a5e test: Various fixes for Fedora40 00:00:31.822 726a04d70 test/nvmf: adjust timeout for bigger nvmes 00:00:31.822 61c96acfb dpdk: Point dpdk submodule at a latest fix from spdk-23.11 00:00:31.822 7db6dcdb8 nvme/fio_plugin: update the way ruhs descriptors are fetched 00:00:31.822 ff6f5c41e nvme/fio_plugin: trim add support for multiple ranges 00:00:31.833 [Pipeline] } 00:00:31.847 [Pipeline] // stage 00:00:31.857 [Pipeline] stage 00:00:31.859 [Pipeline] { (Prepare) 00:00:31.877 [Pipeline] writeFile 00:00:31.893 [Pipeline] sh 00:00:32.177 + logger -p user.info -t JENKINS-CI 00:00:32.189 [Pipeline] sh 00:00:32.472 + logger -p user.info -t JENKINS-CI 00:00:32.486 [Pipeline] sh 00:00:32.774 + cat autorun-spdk.conf 00:00:32.774 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:32.774 SPDK_TEST_FUZZER_SHORT=1 00:00:32.774 SPDK_TEST_FUZZER=1 00:00:32.774 SPDK_RUN_UBSAN=1 00:00:32.781 RUN_NIGHTLY=1 00:00:32.786 [Pipeline] readFile 00:00:32.810 [Pipeline] withEnv 00:00:32.812 [Pipeline] { 00:00:32.824 [Pipeline] sh 00:00:33.110 + set -ex 00:00:33.110 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:00:33.110 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:00:33.110 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:33.110 ++ SPDK_TEST_FUZZER_SHORT=1 00:00:33.110 ++ SPDK_TEST_FUZZER=1 00:00:33.110 ++ SPDK_RUN_UBSAN=1 00:00:33.110 ++ RUN_NIGHTLY=1 00:00:33.110 + case $SPDK_TEST_NVMF_NICS in 00:00:33.110 + DRIVERS= 00:00:33.110 + [[ -n '' ]] 00:00:33.110 + exit 0 00:00:33.119 
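
autorun-spdk.conf is a flat KEY=1 file: the pipeline writes it, then sources it so each SPDK_* flag becomes a shell variable gating a test suite. A hedged sketch of the consumption side, reconstructed from the trace above (in this run no NIC suite is selected, so the driver list stays empty):

set -ex
CONF=/var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
[[ -f $CONF ]]                      # hard requirement, as in the trace
source "$CONF"                      # SPDK_TEST_FUZZER=1, SPDK_RUN_UBSAN=1, ...
case "${SPDK_TEST_NVMF_NICS:-}" in
  *) DRIVERS= ;;                    # no NIC set requested in this run
esac
if [[ -n $DRIVERS ]]; then
  :                                 # NIC runs would load kernel drivers here (assumption)
fi
exit 0
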
[Pipeline] } 00:00:33.133 [Pipeline] // withEnv 00:00:33.139 [Pipeline] } 00:00:33.152 [Pipeline] // stage 00:00:33.162 [Pipeline] catchError 00:00:33.163 [Pipeline] { 00:00:33.177 [Pipeline] timeout 00:00:33.177 Timeout set to expire in 30 min 00:00:33.179 [Pipeline] { 00:00:33.193 [Pipeline] stage 00:00:33.195 [Pipeline] { (Tests) 00:00:33.210 [Pipeline] sh 00:00:33.495 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:33.495 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:33.495 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:00:33.495 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:00:33.495 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:33.495 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:00:33.495 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:00:33.495 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:00:33.495 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:00:33.495 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:00:33.495 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:00:33.495 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:33.495 + source /etc/os-release 00:00:33.495 ++ NAME='Fedora Linux' 00:00:33.495 ++ VERSION='39 (Cloud Edition)' 00:00:33.495 ++ ID=fedora 00:00:33.495 ++ VERSION_ID=39 00:00:33.495 ++ VERSION_CODENAME= 00:00:33.495 ++ PLATFORM_ID=platform:f39 00:00:33.495 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)' 00:00:33.495 ++ ANSI_COLOR='0;38;2;60;110;180' 00:00:33.495 ++ LOGO=fedora-logo-icon 00:00:33.495 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39 00:00:33.495 ++ HOME_URL=https://fedoraproject.org/ 00:00:33.495 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/ 00:00:33.495 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:00:33.495 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:00:33.495 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:00:33.495 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39 00:00:33.495 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:00:33.495 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39 00:00:33.495 ++ SUPPORT_END=2024-11-12 00:00:33.495 ++ VARIANT='Cloud Edition' 00:00:33.495 ++ VARIANT_ID=cloud 00:00:33.495 + uname -a 00:00:33.495 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux 00:00:33.495 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:00:36.030 Hugepages 00:00:36.030 node hugesize free / total 00:00:36.030 node0 1048576kB 0 / 0 00:00:36.030 node0 2048kB 0 / 0 00:00:36.030 node1 1048576kB 0 / 0 00:00:36.030 node1 2048kB 0 / 0 00:00:36.030 00:00:36.030 Type BDF Vendor Device NUMA Driver Device Block devices 00:00:36.030 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:00:36.030 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:00:36.030 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:00:36.030 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:00:36.030 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:00:36.030 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:00:36.030 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:00:36.030 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:00:36.030 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:00:36.030 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:00:36.030 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:00:36.030 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:00:36.030 I/OAT 0000:80:04.4 8086 
2021 1 ioatdma - - 00:00:36.030 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:00:36.030 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:00:36.030 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:00:36.030 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:00:36.030 + rm -f /tmp/spdk-ld-path 00:00:36.030 + source autorun-spdk.conf 00:00:36.030 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:36.030 ++ SPDK_TEST_FUZZER_SHORT=1 00:00:36.030 ++ SPDK_TEST_FUZZER=1 00:00:36.030 ++ SPDK_RUN_UBSAN=1 00:00:36.030 ++ RUN_NIGHTLY=1 00:00:36.030 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:00:36.030 + [[ -n '' ]] 00:00:36.030 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:36.030 + for M in /var/spdk/build-*-manifest.txt 00:00:36.030 + [[ -f /var/spdk/build-kernel-manifest.txt ]] 00:00:36.030 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:00:36.030 + for M in /var/spdk/build-*-manifest.txt 00:00:36.030 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:00:36.030 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:00:36.030 + for M in /var/spdk/build-*-manifest.txt 00:00:36.030 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:00:36.030 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:00:36.030 ++ uname 00:00:36.030 + [[ Linux == \L\i\n\u\x ]] 00:00:36.030 + sudo dmesg -T 00:00:36.030 + sudo dmesg --clear 00:00:36.030 + dmesg_pid=1478011 00:00:36.030 + sudo dmesg -Tw 00:00:36.030 + [[ Fedora Linux == FreeBSD ]] 00:00:36.030 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:36.030 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:36.030 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:00:36.030 + [[ -x /usr/src/fio-static/fio ]] 00:00:36.030 + export FIO_BIN=/usr/src/fio-static/fio 00:00:36.030 + FIO_BIN=/usr/src/fio-static/fio 00:00:36.030 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:00:36.030 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:00:36.030 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:00:36.030 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:36.030 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:36.030 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:00:36.030 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:36.030 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:36.030 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:00:36.030 Test configuration: 00:00:36.030 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:36.030 SPDK_TEST_FUZZER_SHORT=1 00:00:36.030 SPDK_TEST_FUZZER=1 00:00:36.030 SPDK_RUN_UBSAN=1 00:00:36.030 RUN_NIGHTLY=1 15:16:21 -- common/autotest_common.sh@1689 -- $ [[ n == y ]] 00:00:36.030 15:16:21 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:00:36.030 15:16:21 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:00:36.030 15:16:21 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:00:36.030 15:16:21 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:00:36.030 15:16:21 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:36.030 15:16:21 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:36.030 15:16:21 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:36.030 15:16:21 -- paths/export.sh@5 -- $ export PATH 00:00:36.030 15:16:21 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:36.030 15:16:21 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:00:36.030 15:16:21 -- common/autobuild_common.sh@440 -- $ date +%s 00:00:36.030 15:16:21 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1733148981.XXXXXX 00:00:36.030 15:16:21 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1733148981.99HHHi 00:00:36.030 15:16:21 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]] 00:00:36.030 15:16:21 -- common/autobuild_common.sh@446 -- $ '[' -n '' ']' 00:00:36.030 15:16:21 
-- common/autobuild_common.sh@449 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/' 00:00:36.030 15:16:21 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:00:36.030 15:16:21 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:00:36.030 15:16:21 -- common/autobuild_common.sh@456 -- $ get_config_params 00:00:36.030 15:16:21 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:00:36.030 15:16:21 -- common/autotest_common.sh@10 -- $ set +x 00:00:36.030 15:16:21 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user' 00:00:36.030 15:16:21 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:00:36.030 15:16:21 -- spdk/autobuild.sh@12 -- $ umask 022 00:00:36.030 15:16:21 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:36.030 15:16:21 -- spdk/autobuild.sh@16 -- $ date -u 00:00:36.030 Mon Dec 2 02:16:21 PM UTC 2024 00:00:36.030 15:16:21 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:00:36.290 LTS-67-gc13c99a5e 00:00:36.290 15:16:21 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:00:36.290 15:16:21 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:00:36.290 15:16:21 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:00:36.290 15:16:21 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:00:36.290 15:16:21 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:00:36.290 15:16:21 -- common/autotest_common.sh@10 -- $ set +x 00:00:36.290 ************************************ 00:00:36.290 START TEST ubsan 00:00:36.290 ************************************ 00:00:36.290 15:16:21 -- common/autotest_common.sh@1114 -- $ echo 'using ubsan' 00:00:36.290 using ubsan 00:00:36.290 00:00:36.290 real 0m0.000s 00:00:36.290 user 0m0.000s 00:00:36.290 sys 0m0.000s 00:00:36.290 15:16:21 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:00:36.290 15:16:21 -- common/autotest_common.sh@10 -- $ set +x 00:00:36.290 ************************************ 00:00:36.290 END TEST ubsan 00:00:36.290 ************************************ 00:00:36.290 15:16:21 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:00:36.290 15:16:21 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:00:36.290 15:16:21 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:00:36.290 15:16:21 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:00:36.290 15:16:21 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:00:36.290 15:16:21 -- common/autobuild_common.sh@428 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:00:36.290 15:16:21 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']' 00:00:36.290 15:16:21 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:00:36.290 15:16:21 -- common/autotest_common.sh@10 -- $ set +x 00:00:36.290 ************************************ 00:00:36.290 START TEST autobuild_llvm_precompile 00:00:36.290 ************************************ 00:00:36.290 15:16:21 -- common/autotest_common.sh@1114 -- $ _llvm_precompile 00:00:36.290 15:16:21 -- common/autobuild_common.sh@32 -- $ clang 
--version 00:00:36.290 15:16:21 -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:00:36.290 Target: x86_64-redhat-linux-gnu 00:00:36.290 Thread model: posix 00:00:36.290 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:00:36.290 15:16:21 -- common/autobuild_common.sh@33 -- $ clang_num=17 00:00:36.291 15:16:21 -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:00:36.291 15:16:21 -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:00:36.291 15:16:21 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:00:36.291 15:16:21 -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:00:36.291 15:16:21 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:00:36.291 15:16:21 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:00:36.291 15:16:21 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:00:36.291 15:16:21 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:00:36.291 15:16:21 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:00:36.550 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:00:36.550 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:00:36.809 Using 'verbs' RDMA provider 00:00:52.332 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:01:04.545 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:01:04.803 Creating mk/config.mk...done. 00:01:04.803 Creating mk/cc.flags.mk...done. 00:01:04.803 Type 'make' to build. 
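
The llvm_precompile step traced above pins the toolchain by parsing the clang major version, then locates the matching libFuzzer archive by glob and threads it into configure as --with-fuzzer. A simplified sketch of that detection (paths mirror the log; the sed expression stands in for the bash regex in autobuild_common.sh, and the config_params list is abbreviated):

SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
clang_num=$(clang --version | sed -n 's/.*clang version \([0-9][0-9]*\)\..*/\1/p')
export CC=clang-$clang_num CXX=clang++-$clang_num
# First glob match wins; on this host it resolves to
# /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
fuzzer_lib=(/usr/lib*/clang/"$clang_num"/lib/*linux*/libclang_rt.fuzzer_no_main*.a)
config_params='--enable-debug --enable-werror --enable-ubsan'  # abbreviated; full list in the log
[[ -e ${fuzzer_lib[0]} ]] && config_params+=" --with-fuzzer=${fuzzer_lib[0]}"
"$SPDK_DIR"/configure $config_params
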
00:01:04.803 00:01:04.803 real 0m28.567s 00:01:04.803 user 0m12.576s 00:01:04.803 sys 0m15.380s 00:01:04.803 15:16:50 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:01:04.803 15:16:50 -- common/autotest_common.sh@10 -- $ set +x 00:01:04.803 ************************************ 00:01:04.803 END TEST autobuild_llvm_precompile 00:01:04.803 ************************************ 00:01:04.803 15:16:50 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:01:04.803 15:16:50 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:01:04.803 15:16:50 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:01:04.803 15:16:50 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:01:04.803 15:16:50 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:01:05.062 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:01:05.062 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:01:05.321 Using 'verbs' RDMA provider 00:01:18.470 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:01:30.681 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:01:30.681 Creating mk/config.mk...done. 00:01:30.681 Creating mk/cc.flags.mk...done. 00:01:30.681 Type 'make' to build. 00:01:30.681 15:17:14 -- spdk/autobuild.sh@69 -- $ run_test make make -j112 00:01:30.681 15:17:14 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:01:30.681 15:17:14 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:01:30.681 15:17:14 -- common/autotest_common.sh@10 -- $ set +x 00:01:30.681 ************************************ 00:01:30.681 START TEST make 00:01:30.681 ************************************ 00:01:30.681 15:17:14 -- common/autotest_common.sh@1114 -- $ make -j112 00:01:30.681 make[1]: Nothing to be done for 'all'. 
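
run_test, visible throughout this log, brackets a command with START/END banners and a timing summary; the real/user/sys lines above come from it. A hedged sketch of its visible shape only (the actual helper in autotest_common.sh also manages xtrace and failure accounting):

run_test() {
  local name=$1; shift
  echo "************************************"
  echo "START TEST $name"
  echo "************************************"
  time "$@"                  # produces the real/user/sys lines seen above
  local rc=$?
  echo "************************************"
  echo "END TEST $name"
  echo "************************************"
  return $rc
}
run_test make make -j112     # as invoked above for the main build
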
00:01:30.940 The Meson build system 00:01:30.940 Version: 1.5.0 00:01:30.940 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:01:30.940 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:01:30.940 Build type: native build 00:01:30.940 Project name: libvfio-user 00:01:30.940 Project version: 0.0.1 00:01:30.940 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:01:30.940 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:01:30.940 Host machine cpu family: x86_64 00:01:30.940 Host machine cpu: x86_64 00:01:30.940 Run-time dependency threads found: YES 00:01:30.940 Library dl found: YES 00:01:30.940 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:01:30.940 Run-time dependency json-c found: YES 0.17 00:01:30.940 Run-time dependency cmocka found: YES 1.1.7 00:01:30.940 Program pytest-3 found: NO 00:01:30.940 Program flake8 found: NO 00:01:30.940 Program misspell-fixer found: NO 00:01:30.940 Program restructuredtext-lint found: NO 00:01:30.940 Program valgrind found: YES (/usr/bin/valgrind) 00:01:30.940 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:30.940 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:30.940 Compiler for C supports arguments -Wwrite-strings: YES 00:01:30.940 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:01:30.940 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:01:30.940 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:01:30.940 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
00:01:30.940 Build targets in project: 8 00:01:30.940 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:01:30.940 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:01:30.940 00:01:30.940 libvfio-user 0.0.1 00:01:30.940 00:01:30.940 User defined options 00:01:30.940 buildtype : debug 00:01:30.940 default_library: static 00:01:30.940 libdir : /usr/local/lib 00:01:30.940 00:01:30.940 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:31.199 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:01:31.457 [1/36] Compiling C object samples/lspci.p/lspci.c.o 00:01:31.457 [2/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:01:31.457 [3/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:01:31.457 [4/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:01:31.457 [5/36] Compiling C object samples/null.p/null.c.o 00:01:31.457 [6/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:01:31.457 [7/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:01:31.457 [8/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:01:31.457 [9/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:01:31.457 [10/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:01:31.457 [11/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:01:31.457 [12/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:01:31.457 [13/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:01:31.457 [14/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:01:31.457 [15/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:01:31.457 [16/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:01:31.457 [17/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:01:31.457 [18/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:01:31.457 [19/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:01:31.457 [20/36] Compiling C object test/unit_tests.p/mocks.c.o 00:01:31.457 [21/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:01:31.457 [22/36] Compiling C object samples/server.p/server.c.o 00:01:31.457 [23/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:01:31.457 [24/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:01:31.457 [25/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:01:31.457 [26/36] Compiling C object samples/client.p/client.c.o 00:01:31.457 [27/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:01:31.457 [28/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:01:31.457 [29/36] Linking static target lib/libvfio-user.a 00:01:31.457 [30/36] Linking target samples/client 00:01:31.457 [31/36] Linking target test/unit_tests 00:01:31.457 [32/36] Linking target samples/server 00:01:31.457 [33/36] Linking target samples/null 00:01:31.457 [34/36] Linking target samples/gpio-pci-idio-16 00:01:31.457 [35/36] Linking target samples/lspci 00:01:31.457 [36/36] Linking target samples/shadow_ioeventfd_server 00:01:31.457 INFO: autodetecting backend as ninja 00:01:31.457 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:01:31.716 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:01:31.974 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:01:31.974 ninja: no work to do. 00:01:37.243 The Meson build system 00:01:37.243 Version: 1.5.0 00:01:37.243 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk 00:01:37.243 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp 00:01:37.243 Build type: native build 00:01:37.243 Program cat found: YES (/usr/bin/cat) 00:01:37.243 Project name: DPDK 00:01:37.243 Project version: 23.11.0 00:01:37.243 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:01:37.243 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:01:37.243 Host machine cpu family: x86_64 00:01:37.243 Host machine cpu: x86_64 00:01:37.243 Message: ## Building in Developer Mode ## 00:01:37.243 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:37.243 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:01:37.243 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:01:37.243 Program python3 found: YES (/usr/bin/python3) 00:01:37.243 Program cat found: YES (/usr/bin/cat) 00:01:37.243 Compiler for C supports arguments -march=native: YES 00:01:37.243 Checking for size of "void *" : 8 00:01:37.243 Checking for size of "void *" : 8 (cached) 00:01:37.243 Library m found: YES 00:01:37.243 Library numa found: YES 00:01:37.243 Has header "numaif.h" : YES 00:01:37.243 Library fdt found: NO 00:01:37.243 Library execinfo found: NO 00:01:37.243 Has header "execinfo.h" : YES 00:01:37.243 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:01:37.243 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:37.243 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:37.243 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:37.243 Run-time dependency openssl found: YES 3.1.1 00:01:37.243 Run-time dependency libpcap found: YES 1.10.4 00:01:37.243 Has header "pcap.h" with dependency libpcap: YES 00:01:37.243 Compiler for C supports arguments -Wcast-qual: YES 00:01:37.243 Compiler for C supports arguments -Wdeprecated: YES 00:01:37.243 Compiler for C supports arguments -Wformat: YES 00:01:37.243 Compiler for C supports arguments -Wformat-nonliteral: YES 00:01:37.243 Compiler for C supports arguments -Wformat-security: YES 00:01:37.244 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:37.244 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:37.244 Compiler for C supports arguments -Wnested-externs: YES 00:01:37.244 Compiler for C supports arguments -Wold-style-definition: YES 00:01:37.244 Compiler for C supports arguments -Wpointer-arith: YES 00:01:37.244 Compiler for C supports arguments -Wsign-compare: YES 00:01:37.244 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:37.244 Compiler for C supports arguments -Wundef: YES 00:01:37.244 Compiler for C supports arguments -Wwrite-strings: YES 00:01:37.244 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:37.244 Compiler for C supports arguments -Wno-packed-not-aligned: NO 00:01:37.244 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:37.244 Program objdump found: YES (/usr/bin/objdump) 00:01:37.244 
Compiler for C supports arguments -mavx512f: YES 00:01:37.244 Checking if "AVX512 checking" compiles: YES 00:01:37.244 Fetching value of define "__SSE4_2__" : 1 00:01:37.244 Fetching value of define "__AES__" : 1 00:01:37.244 Fetching value of define "__AVX__" : 1 00:01:37.244 Fetching value of define "__AVX2__" : 1 00:01:37.244 Fetching value of define "__AVX512BW__" : 1 00:01:37.244 Fetching value of define "__AVX512CD__" : 1 00:01:37.244 Fetching value of define "__AVX512DQ__" : 1 00:01:37.244 Fetching value of define "__AVX512F__" : 1 00:01:37.244 Fetching value of define "__AVX512VL__" : 1 00:01:37.244 Fetching value of define "__PCLMUL__" : 1 00:01:37.244 Fetching value of define "__RDRND__" : 1 00:01:37.244 Fetching value of define "__RDSEED__" : 1 00:01:37.244 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:37.244 Fetching value of define "__znver1__" : (undefined) 00:01:37.244 Fetching value of define "__znver2__" : (undefined) 00:01:37.244 Fetching value of define "__znver3__" : (undefined) 00:01:37.244 Fetching value of define "__znver4__" : (undefined) 00:01:37.244 Compiler for C supports arguments -Wno-format-truncation: NO 00:01:37.244 Message: lib/log: Defining dependency "log" 00:01:37.244 Message: lib/kvargs: Defining dependency "kvargs" 00:01:37.244 Message: lib/telemetry: Defining dependency "telemetry" 00:01:37.244 Checking for function "getentropy" : NO 00:01:37.244 Message: lib/eal: Defining dependency "eal" 00:01:37.244 Message: lib/ring: Defining dependency "ring" 00:01:37.244 Message: lib/rcu: Defining dependency "rcu" 00:01:37.244 Message: lib/mempool: Defining dependency "mempool" 00:01:37.244 Message: lib/mbuf: Defining dependency "mbuf" 00:01:37.244 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:37.244 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:37.244 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:37.244 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:37.244 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:37.244 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:37.244 Compiler for C supports arguments -mpclmul: YES 00:01:37.244 Compiler for C supports arguments -maes: YES 00:01:37.244 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:37.244 Compiler for C supports arguments -mavx512bw: YES 00:01:37.244 Compiler for C supports arguments -mavx512dq: YES 00:01:37.244 Compiler for C supports arguments -mavx512vl: YES 00:01:37.244 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:37.244 Compiler for C supports arguments -mavx2: YES 00:01:37.244 Compiler for C supports arguments -mavx: YES 00:01:37.244 Message: lib/net: Defining dependency "net" 00:01:37.244 Message: lib/meter: Defining dependency "meter" 00:01:37.244 Message: lib/ethdev: Defining dependency "ethdev" 00:01:37.244 Message: lib/pci: Defining dependency "pci" 00:01:37.244 Message: lib/cmdline: Defining dependency "cmdline" 00:01:37.244 Message: lib/hash: Defining dependency "hash" 00:01:37.244 Message: lib/timer: Defining dependency "timer" 00:01:37.244 Message: lib/compressdev: Defining dependency "compressdev" 00:01:37.244 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:37.244 Message: lib/dmadev: Defining dependency "dmadev" 00:01:37.244 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:37.244 Message: lib/power: Defining dependency "power" 00:01:37.244 Message: lib/reorder: Defining dependency "reorder" 00:01:37.244 Message: lib/security: Defining dependency 
"security" 00:01:37.244 Has header "linux/userfaultfd.h" : YES 00:01:37.244 Has header "linux/vduse.h" : YES 00:01:37.244 Message: lib/vhost: Defining dependency "vhost" 00:01:37.244 Compiler for C supports arguments -Wno-format-truncation: NO (cached) 00:01:37.244 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:37.244 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:37.244 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:37.244 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:01:37.244 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:01:37.244 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:01:37.244 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:01:37.244 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:01:37.244 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:01:37.244 Program doxygen found: YES (/usr/local/bin/doxygen) 00:01:37.244 Configuring doxy-api-html.conf using configuration 00:01:37.244 Configuring doxy-api-man.conf using configuration 00:01:37.244 Program mandb found: YES (/usr/bin/mandb) 00:01:37.244 Program sphinx-build found: NO 00:01:37.244 Configuring rte_build_config.h using configuration 00:01:37.244 Message: 00:01:37.244 ================= 00:01:37.244 Applications Enabled 00:01:37.244 ================= 00:01:37.244 00:01:37.244 apps: 00:01:37.244 00:01:37.244 00:01:37.244 Message: 00:01:37.244 ================= 00:01:37.244 Libraries Enabled 00:01:37.244 ================= 00:01:37.244 00:01:37.244 libs: 00:01:37.244 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:01:37.244 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:01:37.244 cryptodev, dmadev, power, reorder, security, vhost, 00:01:37.244 00:01:37.244 Message: 00:01:37.244 =============== 00:01:37.244 Drivers Enabled 00:01:37.244 =============== 00:01:37.244 00:01:37.244 common: 00:01:37.244 00:01:37.244 bus: 00:01:37.244 pci, vdev, 00:01:37.244 mempool: 00:01:37.244 ring, 00:01:37.244 dma: 00:01:37.244 00:01:37.244 net: 00:01:37.244 00:01:37.244 crypto: 00:01:37.244 00:01:37.244 compress: 00:01:37.244 00:01:37.244 vdpa: 00:01:37.244 00:01:37.244 00:01:37.244 Message: 00:01:37.244 ================= 00:01:37.244 Content Skipped 00:01:37.244 ================= 00:01:37.244 00:01:37.244 apps: 00:01:37.244 dumpcap: explicitly disabled via build config 00:01:37.244 graph: explicitly disabled via build config 00:01:37.244 pdump: explicitly disabled via build config 00:01:37.244 proc-info: explicitly disabled via build config 00:01:37.244 test-acl: explicitly disabled via build config 00:01:37.244 test-bbdev: explicitly disabled via build config 00:01:37.244 test-cmdline: explicitly disabled via build config 00:01:37.244 test-compress-perf: explicitly disabled via build config 00:01:37.244 test-crypto-perf: explicitly disabled via build config 00:01:37.244 test-dma-perf: explicitly disabled via build config 00:01:37.244 test-eventdev: explicitly disabled via build config 00:01:37.244 test-fib: explicitly disabled via build config 00:01:37.244 test-flow-perf: explicitly disabled via build config 00:01:37.244 test-gpudev: explicitly disabled via build config 00:01:37.244 test-mldev: explicitly disabled via build config 00:01:37.244 test-pipeline: explicitly disabled via build config 00:01:37.244 test-pmd: explicitly disabled via build config 00:01:37.244 test-regex: explicitly disabled 
via build config 00:01:37.244 test-sad: explicitly disabled via build config 00:01:37.244 test-security-perf: explicitly disabled via build config 00:01:37.244 00:01:37.244 libs: 00:01:37.244 metrics: explicitly disabled via build config 00:01:37.244 acl: explicitly disabled via build config 00:01:37.244 bbdev: explicitly disabled via build config 00:01:37.244 bitratestats: explicitly disabled via build config 00:01:37.244 bpf: explicitly disabled via build config 00:01:37.244 cfgfile: explicitly disabled via build config 00:01:37.244 distributor: explicitly disabled via build config 00:01:37.244 efd: explicitly disabled via build config 00:01:37.244 eventdev: explicitly disabled via build config 00:01:37.244 dispatcher: explicitly disabled via build config 00:01:37.244 gpudev: explicitly disabled via build config 00:01:37.244 gro: explicitly disabled via build config 00:01:37.244 gso: explicitly disabled via build config 00:01:37.244 ip_frag: explicitly disabled via build config 00:01:37.244 jobstats: explicitly disabled via build config 00:01:37.244 latencystats: explicitly disabled via build config 00:01:37.244 lpm: explicitly disabled via build config 00:01:37.244 member: explicitly disabled via build config 00:01:37.244 pcapng: explicitly disabled via build config 00:01:37.244 rawdev: explicitly disabled via build config 00:01:37.244 regexdev: explicitly disabled via build config 00:01:37.244 mldev: explicitly disabled via build config 00:01:37.244 rib: explicitly disabled via build config 00:01:37.244 sched: explicitly disabled via build config 00:01:37.244 stack: explicitly disabled via build config 00:01:37.244 ipsec: explicitly disabled via build config 00:01:37.244 pdcp: explicitly disabled via build config 00:01:37.244 fib: explicitly disabled via build config 00:01:37.244 port: explicitly disabled via build config 00:01:37.244 pdump: explicitly disabled via build config 00:01:37.244 table: explicitly disabled via build config 00:01:37.244 pipeline: explicitly disabled via build config 00:01:37.244 graph: explicitly disabled via build config 00:01:37.244 node: explicitly disabled via build config 00:01:37.244 00:01:37.244 drivers: 00:01:37.244 common/cpt: not in enabled drivers build config 00:01:37.244 common/dpaax: not in enabled drivers build config 00:01:37.244 common/iavf: not in enabled drivers build config 00:01:37.244 common/idpf: not in enabled drivers build config 00:01:37.244 common/mvep: not in enabled drivers build config 00:01:37.244 common/octeontx: not in enabled drivers build config 00:01:37.245 bus/auxiliary: not in enabled drivers build config 00:01:37.245 bus/cdx: not in enabled drivers build config 00:01:37.245 bus/dpaa: not in enabled drivers build config 00:01:37.245 bus/fslmc: not in enabled drivers build config 00:01:37.245 bus/ifpga: not in enabled drivers build config 00:01:37.245 bus/platform: not in enabled drivers build config 00:01:37.245 bus/vmbus: not in enabled drivers build config 00:01:37.245 common/cnxk: not in enabled drivers build config 00:01:37.245 common/mlx5: not in enabled drivers build config 00:01:37.245 common/nfp: not in enabled drivers build config 00:01:37.245 common/qat: not in enabled drivers build config 00:01:37.245 common/sfc_efx: not in enabled drivers build config 00:01:37.245 mempool/bucket: not in enabled drivers build config 00:01:37.245 mempool/cnxk: not in enabled drivers build config 00:01:37.245 mempool/dpaa: not in enabled drivers build config 00:01:37.245 mempool/dpaa2: not in enabled drivers build config 
00:01:37.245 mempool/octeontx: not in enabled drivers build config 00:01:37.245 mempool/stack: not in enabled drivers build config 00:01:37.245 dma/cnxk: not in enabled drivers build config 00:01:37.245 dma/dpaa: not in enabled drivers build config 00:01:37.245 dma/dpaa2: not in enabled drivers build config 00:01:37.245 dma/hisilicon: not in enabled drivers build config 00:01:37.245 dma/idxd: not in enabled drivers build config 00:01:37.245 dma/ioat: not in enabled drivers build config 00:01:37.245 dma/skeleton: not in enabled drivers build config 00:01:37.245 net/af_packet: not in enabled drivers build config 00:01:37.245 net/af_xdp: not in enabled drivers build config 00:01:37.245 net/ark: not in enabled drivers build config 00:01:37.245 net/atlantic: not in enabled drivers build config 00:01:37.245 net/avp: not in enabled drivers build config 00:01:37.245 net/axgbe: not in enabled drivers build config 00:01:37.245 net/bnx2x: not in enabled drivers build config 00:01:37.245 net/bnxt: not in enabled drivers build config 00:01:37.245 net/bonding: not in enabled drivers build config 00:01:37.245 net/cnxk: not in enabled drivers build config 00:01:37.245 net/cpfl: not in enabled drivers build config 00:01:37.245 net/cxgbe: not in enabled drivers build config 00:01:37.245 net/dpaa: not in enabled drivers build config 00:01:37.245 net/dpaa2: not in enabled drivers build config 00:01:37.245 net/e1000: not in enabled drivers build config 00:01:37.245 net/ena: not in enabled drivers build config 00:01:37.245 net/enetc: not in enabled drivers build config 00:01:37.245 net/enetfec: not in enabled drivers build config 00:01:37.245 net/enic: not in enabled drivers build config 00:01:37.245 net/failsafe: not in enabled drivers build config 00:01:37.245 net/fm10k: not in enabled drivers build config 00:01:37.245 net/gve: not in enabled drivers build config 00:01:37.245 net/hinic: not in enabled drivers build config 00:01:37.245 net/hns3: not in enabled drivers build config 00:01:37.245 net/i40e: not in enabled drivers build config 00:01:37.245 net/iavf: not in enabled drivers build config 00:01:37.245 net/ice: not in enabled drivers build config 00:01:37.245 net/idpf: not in enabled drivers build config 00:01:37.245 net/igc: not in enabled drivers build config 00:01:37.245 net/ionic: not in enabled drivers build config 00:01:37.245 net/ipn3ke: not in enabled drivers build config 00:01:37.245 net/ixgbe: not in enabled drivers build config 00:01:37.245 net/mana: not in enabled drivers build config 00:01:37.245 net/memif: not in enabled drivers build config 00:01:37.245 net/mlx4: not in enabled drivers build config 00:01:37.245 net/mlx5: not in enabled drivers build config 00:01:37.245 net/mvneta: not in enabled drivers build config 00:01:37.245 net/mvpp2: not in enabled drivers build config 00:01:37.245 net/netvsc: not in enabled drivers build config 00:01:37.245 net/nfb: not in enabled drivers build config 00:01:37.245 net/nfp: not in enabled drivers build config 00:01:37.245 net/ngbe: not in enabled drivers build config 00:01:37.245 net/null: not in enabled drivers build config 00:01:37.245 net/octeontx: not in enabled drivers build config 00:01:37.245 net/octeon_ep: not in enabled drivers build config 00:01:37.245 net/pcap: not in enabled drivers build config 00:01:37.245 net/pfe: not in enabled drivers build config 00:01:37.245 net/qede: not in enabled drivers build config 00:01:37.245 net/ring: not in enabled drivers build config 00:01:37.245 net/sfc: not in enabled drivers build config 00:01:37.245 
net/softnic: not in enabled drivers build config 00:01:37.245 net/tap: not in enabled drivers build config 00:01:37.245 net/thunderx: not in enabled drivers build config 00:01:37.245 net/txgbe: not in enabled drivers build config 00:01:37.245 net/vdev_netvsc: not in enabled drivers build config 00:01:37.245 net/vhost: not in enabled drivers build config 00:01:37.245 net/virtio: not in enabled drivers build config 00:01:37.245 net/vmxnet3: not in enabled drivers build config 00:01:37.245 raw/*: missing internal dependency, "rawdev" 00:01:37.245 crypto/armv8: not in enabled drivers build config 00:01:37.245 crypto/bcmfs: not in enabled drivers build config 00:01:37.245 crypto/caam_jr: not in enabled drivers build config 00:01:37.245 crypto/ccp: not in enabled drivers build config 00:01:37.245 crypto/cnxk: not in enabled drivers build config 00:01:37.245 crypto/dpaa_sec: not in enabled drivers build config 00:01:37.245 crypto/dpaa2_sec: not in enabled drivers build config 00:01:37.245 crypto/ipsec_mb: not in enabled drivers build config 00:01:37.245 crypto/mlx5: not in enabled drivers build config 00:01:37.245 crypto/mvsam: not in enabled drivers build config 00:01:37.245 crypto/nitrox: not in enabled drivers build config 00:01:37.245 crypto/null: not in enabled drivers build config 00:01:37.245 crypto/octeontx: not in enabled drivers build config 00:01:37.245 crypto/openssl: not in enabled drivers build config 00:01:37.245 crypto/scheduler: not in enabled drivers build config 00:01:37.245 crypto/uadk: not in enabled drivers build config 00:01:37.245 crypto/virtio: not in enabled drivers build config 00:01:37.245 compress/isal: not in enabled drivers build config 00:01:37.245 compress/mlx5: not in enabled drivers build config 00:01:37.245 compress/octeontx: not in enabled drivers build config 00:01:37.245 compress/zlib: not in enabled drivers build config 00:01:37.245 regex/*: missing internal dependency, "regexdev" 00:01:37.245 ml/*: missing internal dependency, "mldev" 00:01:37.245 vdpa/ifc: not in enabled drivers build config 00:01:37.245 vdpa/mlx5: not in enabled drivers build config 00:01:37.245 vdpa/nfp: not in enabled drivers build config 00:01:37.245 vdpa/sfc: not in enabled drivers build config 00:01:37.245 event/*: missing internal dependency, "eventdev" 00:01:37.245 baseband/*: missing internal dependency, "bbdev" 00:01:37.245 gpu/*: missing internal dependency, "gpudev" 00:01:37.245 00:01:37.245 00:01:37.245 Build targets in project: 85 00:01:37.245 00:01:37.245 DPDK 23.11.0 00:01:37.245 00:01:37.245 User defined options 00:01:37.245 buildtype : debug 00:01:37.245 default_library : static 00:01:37.245 libdir : lib 00:01:37.245 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:01:37.245 c_args : -fPIC -Werror 00:01:37.245 c_link_args : 00:01:37.245 cpu_instruction_set: native 00:01:37.245 disable_apps : test-sad,test-acl,test-dma-perf,test-pipeline,test-compress-perf,test-fib,test-flow-perf,test-crypto-perf,test-bbdev,test-eventdev,pdump,test-mldev,test-cmdline,graph,test-security-perf,test-pmd,test,proc-info,test-regex,dumpcap,test-gpudev 00:01:37.245 disable_libs : port,sched,rib,node,ipsec,distributor,gro,eventdev,pdcp,acl,member,latencystats,efd,stack,regexdev,rawdev,bpf,metrics,gpudev,pipeline,pdump,table,fib,dispatcher,mldev,gso,cfgfile,bitratestats,ip_frag,graph,lpm,jobstats,pcapng,bbdev 00:01:37.245 enable_docs : false 00:01:37.245 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring 00:01:37.245 enable_kmods : false 00:01:37.245 tests : false 
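
Both libvfio-user and DPDK above go through the same meson/ninja cycle: configure a build directory with the options echoed in the summary, then hand off to ninja. A condensed sketch of the DPDK invocation implied by that summary (option names and values are copied from the "User defined options" block; the exact wrapper in SPDK's dpdk build scripts differs):

DPDK_SRC=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk
meson setup "$DPDK_SRC/build-tmp" "$DPDK_SRC" \
  --buildtype=debug \
  --default-library=static \
  --libdir=lib \
  --prefix="$DPDK_SRC/build" \
  -Dc_args='-fPIC -Werror' \
  -Dcpu_instruction_set=native \
  -Dtests=false \
  -Denable_docs=false \
  -Denable_kmods=false \
  -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring
# The long disable_apps/disable_libs lists from the summary are passed the same
# way (-Ddisable_apps=test-sad,test-acl,...); elided here for brevity.
ninja -C "$DPDK_SRC/build-tmp"
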
00:01:37.245 00:01:37.245 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:37.504 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp' 00:01:37.771 [1/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:37.771 [2/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:37.771 [3/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:37.771 [4/265] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:37.771 [5/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:37.771 [6/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:37.771 [7/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:37.771 [8/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:37.771 [9/265] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:37.771 [10/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:37.771 [11/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:37.771 [12/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:37.771 [13/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:37.771 [14/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:37.771 [15/265] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:37.771 [16/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:37.771 [17/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:37.771 [18/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:37.771 [19/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:37.771 [20/265] Linking static target lib/librte_kvargs.a 00:01:37.771 [21/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:37.771 [22/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:37.771 [23/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:37.771 [24/265] Linking static target lib/librte_log.a 00:01:37.771 [25/265] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:37.771 [26/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:37.771 [27/265] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:37.771 [28/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:37.771 [29/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:37.771 [30/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:37.771 [31/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:37.771 [32/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:37.771 [33/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:37.771 [34/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:37.771 [35/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:37.771 [36/265] Linking static target lib/librte_pci.a 00:01:37.771 [37/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:37.771 [38/265] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:37.771 [39/265] Compiling C object 
lib/librte_power.a.p/power_power_common.c.o 00:01:37.771 [40/265] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:37.771 [41/265] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:38.034 [42/265] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.034 [43/265] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.034 [44/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:38.293 [45/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:38.293 [46/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:38.293 [47/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:38.293 [48/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:38.293 [49/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:38.293 [50/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:38.293 [51/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:38.293 [52/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:38.293 [53/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:38.293 [54/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:38.293 [55/265] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:38.293 [56/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:38.293 [57/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:38.293 [58/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:38.293 [59/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:38.293 [60/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:38.293 [61/265] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:38.293 [62/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:38.293 [63/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:38.293 [64/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:38.293 [65/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:38.293 [66/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:38.293 [67/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:38.293 [68/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:38.293 [69/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:38.293 [70/265] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:38.293 [71/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:38.293 [72/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:38.293 [73/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:38.293 [74/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:38.293 [75/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:38.293 [76/265] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:38.293 [77/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:38.293 [78/265] Compiling C object 
lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:38.293 [79/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:38.293 [80/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:38.293 [81/265] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:38.293 [82/265] Linking static target lib/librte_telemetry.a 00:01:38.293 [83/265] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:38.293 [84/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:38.293 [85/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:38.293 [86/265] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:38.293 [87/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:38.293 [88/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:38.293 [89/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:38.293 [90/265] Linking static target lib/librte_meter.a 00:01:38.293 [91/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:38.294 [92/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:38.294 [93/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:38.294 [94/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:38.294 [95/265] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:38.294 [96/265] Linking static target lib/librte_ring.a 00:01:38.294 [97/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:38.294 [98/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:38.294 [99/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:38.294 [100/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:38.294 [101/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:38.294 [102/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:38.294 [103/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:38.294 [104/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:38.294 [105/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:38.294 [106/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:38.294 [107/265] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:38.294 [108/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:38.294 [109/265] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:38.294 [110/265] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:38.294 [111/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:38.294 [112/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:38.294 [113/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:38.294 [114/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:38.294 [115/265] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:38.294 [116/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:38.294 [117/265] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:38.294 [118/265] Compiling C object 
lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:38.294 [119/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:38.294 [120/265] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:38.294 [121/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:38.294 [122/265] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:38.294 [123/265] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:38.294 [124/265] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:38.294 [125/265] Linking static target lib/librte_cmdline.a 00:01:38.294 [126/265] Linking static target lib/librte_eal.a 00:01:38.294 [127/265] Linking static target lib/librte_timer.a 00:01:38.294 [128/265] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.294 [129/265] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:38.294 [130/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:38.294 [131/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:38.294 [132/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:38.294 [133/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:38.294 [134/265] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:38.294 [135/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:38.294 [136/265] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:38.294 [137/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:38.294 [138/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:38.294 [139/265] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:38.294 [140/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:38.294 [141/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:38.294 [142/265] Linking static target lib/librte_dmadev.a 00:01:38.294 [143/265] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:38.294 [144/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:38.294 [145/265] Linking target lib/librte_log.so.24.0 00:01:38.294 [146/265] Linking static target lib/librte_net.a 00:01:38.294 [147/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:38.294 [148/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:38.294 [149/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:38.294 [150/265] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:38.294 [151/265] Linking static target lib/librte_rcu.a 00:01:38.294 [152/265] Linking static target lib/librte_compressdev.a 00:01:38.294 [153/265] Linking static target lib/librte_reorder.a 00:01:38.294 [154/265] Linking static target lib/librte_mempool.a 00:01:38.294 [155/265] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:38.554 [156/265] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:38.554 [157/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:38.554 [158/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:38.554 [159/265] Linking static target lib/librte_security.a 00:01:38.554 [160/265] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:38.554 [161/265] Compiling C object 
lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:38.554 [162/265] Linking static target lib/librte_mbuf.a 00:01:38.554 [163/265] Linking static target lib/librte_power.a 00:01:38.554 [164/265] Linking static target lib/librte_hash.a 00:01:38.554 [165/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:38.554 [166/265] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:38.554 [167/265] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:01:38.554 [168/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:38.554 [169/265] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.554 [170/265] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:38.554 [171/265] Linking target lib/librte_kvargs.so.24.0 00:01:38.554 [172/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:38.554 [173/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:38.554 [174/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:38.554 [175/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:38.554 [176/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:38.554 [177/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:38.554 [178/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:38.554 [179/265] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.554 [180/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:38.554 [181/265] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:38.554 [182/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:38.554 [183/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:38.554 [184/265] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:38.554 [185/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:38.554 [186/265] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:38.813 [187/265] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:38.813 [188/265] Linking static target lib/librte_cryptodev.a 00:01:38.813 [189/265] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:01:38.813 [190/265] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.813 [191/265] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:38.813 [192/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:38.813 [193/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:38.813 [194/265] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.813 [195/265] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.813 [196/265] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.813 [197/265] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:38.813 [198/265] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:38.813 [199/265] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:38.813 [200/265] Generating drivers/rte_mempool_ring.pmd.c with 
a custom command 00:01:38.813 [201/265] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.813 [202/265] Linking static target drivers/librte_bus_vdev.a 00:01:38.813 [203/265] Linking target lib/librte_telemetry.so.24.0 00:01:38.813 [204/265] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:38.813 [205/265] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:38.813 [206/265] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:38.813 [207/265] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.813 [208/265] Linking static target drivers/librte_mempool_ring.a 00:01:39.072 [209/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:39.073 [210/265] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:39.073 [211/265] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:39.073 [212/265] Linking static target lib/librte_ethdev.a 00:01:39.073 [213/265] Linking static target drivers/librte_bus_pci.a 00:01:39.073 [214/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:39.073 [215/265] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:01:39.073 [216/265] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.073 [217/265] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.332 [218/265] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.332 [219/265] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.332 [220/265] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.332 [221/265] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.591 [222/265] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.591 [223/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:39.591 [224/265] Linking static target lib/librte_vhost.a 00:01:39.591 [225/265] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.850 [226/265] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.231 [227/265] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:41.800 [228/265] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.461 [229/265] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.000 [230/265] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.260 [231/265] Linking target lib/librte_eal.so.24.0 00:01:51.260 [232/265] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:01:51.260 [233/265] Linking target lib/librte_timer.so.24.0 00:01:51.260 [234/265] Linking target drivers/librte_bus_vdev.so.24.0 00:01:51.260 [235/265] Linking target lib/librte_meter.so.24.0 00:01:51.260 [236/265] Linking target lib/librte_ring.so.24.0 00:01:51.521 [237/265] Linking target lib/librte_pci.so.24.0 00:01:51.521 [238/265] Linking 
target lib/librte_dmadev.so.24.0 00:01:51.521 [239/265] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:01:51.521 [240/265] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:01:51.521 [241/265] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:01:51.521 [242/265] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:01:51.521 [243/265] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:01:51.521 [244/265] Linking target lib/librte_rcu.so.24.0 00:01:51.521 [245/265] Linking target lib/librte_mempool.so.24.0 00:01:51.521 [246/265] Linking target drivers/librte_bus_pci.so.24.0 00:01:51.781 [247/265] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:01:51.781 [248/265] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:01:51.781 [249/265] Linking target lib/librte_mbuf.so.24.0 00:01:51.781 [250/265] Linking target drivers/librte_mempool_ring.so.24.0 00:01:51.781 [251/265] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:01:52.041 [252/265] Linking target lib/librte_net.so.24.0 00:01:52.041 [253/265] Linking target lib/librte_compressdev.so.24.0 00:01:52.041 [254/265] Linking target lib/librte_reorder.so.24.0 00:01:52.041 [255/265] Linking target lib/librte_cryptodev.so.24.0 00:01:52.041 [256/265] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:01:52.041 [257/265] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:01:52.041 [258/265] Linking target lib/librte_security.so.24.0 00:01:52.041 [259/265] Linking target lib/librte_hash.so.24.0 00:01:52.041 [260/265] Linking target lib/librte_ethdev.so.24.0 00:01:52.041 [261/265] Linking target lib/librte_cmdline.so.24.0 00:01:52.327 [262/265] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:01:52.327 [263/265] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:01:52.327 [264/265] Linking target lib/librte_power.so.24.0 00:01:52.327 [265/265] Linking target lib/librte_vhost.so.24.0 00:01:52.327 INFO: autodetecting backend as ninja 00:01:52.327 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp -j 112 00:01:53.267 CC lib/ut/ut.o 00:01:53.267 CC lib/log/log.o 00:01:53.267 CC lib/log/log_flags.o 00:01:53.267 CC lib/log/log_deprecated.o 00:01:53.267 CC lib/ut_mock/mock.o 00:01:53.267 LIB libspdk_ut.a 00:01:53.527 LIB libspdk_log.a 00:01:53.528 LIB libspdk_ut_mock.a 00:01:53.787 CC lib/dma/dma.o 00:01:53.787 CC lib/util/base64.o 00:01:53.787 CC lib/util/cpuset.o 00:01:53.787 CC lib/util/crc16.o 00:01:53.787 CC lib/util/bit_array.o 00:01:53.787 CC lib/util/crc32.o 00:01:53.787 CC lib/util/crc32c.o 00:01:53.787 CC lib/util/crc32_ieee.o 00:01:53.787 CC lib/util/crc64.o 00:01:53.787 CC lib/util/dif.o 00:01:53.787 CC lib/util/fd.o 00:01:53.787 CC lib/util/file.o 00:01:53.787 CC lib/util/hexlify.o 00:01:53.787 CC lib/ioat/ioat.o 00:01:53.787 CC lib/util/iov.o 00:01:53.787 CC lib/util/math.o 00:01:53.787 CC lib/util/pipe.o 00:01:53.787 CC lib/util/string.o 00:01:53.787 CC lib/util/strerror_tls.o 00:01:53.787 CC lib/util/uuid.o 00:01:53.787 CC lib/util/fd_group.o 00:01:53.787 CC lib/util/xor.o 00:01:53.787 CXX lib/trace_parser/trace.o 00:01:53.787 CC lib/util/zipf.o 00:01:53.787 CC 
lib/vfio_user/host/vfio_user_pci.o 00:01:53.787 CC lib/vfio_user/host/vfio_user.o 00:01:53.787 LIB libspdk_dma.a 00:01:53.787 LIB libspdk_ioat.a 00:01:54.047 LIB libspdk_vfio_user.a 00:01:54.047 LIB libspdk_util.a 00:01:54.307 LIB libspdk_trace_parser.a 00:01:54.307 CC lib/rdma/rdma_verbs.o 00:01:54.307 CC lib/rdma/common.o 00:01:54.307 CC lib/env_dpdk/env.o 00:01:54.307 CC lib/conf/conf.o 00:01:54.307 CC lib/env_dpdk/memory.o 00:01:54.307 CC lib/env_dpdk/pci.o 00:01:54.307 CC lib/env_dpdk/init.o 00:01:54.307 CC lib/env_dpdk/threads.o 00:01:54.307 CC lib/env_dpdk/pci_ioat.o 00:01:54.307 CC lib/env_dpdk/pci_virtio.o 00:01:54.307 CC lib/env_dpdk/pci_vmd.o 00:01:54.307 CC lib/json/json_parse.o 00:01:54.307 CC lib/env_dpdk/pci_idxd.o 00:01:54.307 CC lib/env_dpdk/pci_event.o 00:01:54.307 CC lib/json/json_util.o 00:01:54.307 CC lib/env_dpdk/sigbus_handler.o 00:01:54.307 CC lib/json/json_write.o 00:01:54.307 CC lib/env_dpdk/pci_dpdk.o 00:01:54.307 CC lib/env_dpdk/pci_dpdk_2207.o 00:01:54.307 CC lib/env_dpdk/pci_dpdk_2211.o 00:01:54.307 CC lib/vmd/led.o 00:01:54.307 CC lib/vmd/vmd.o 00:01:54.307 CC lib/idxd/idxd_kernel.o 00:01:54.307 CC lib/idxd/idxd.o 00:01:54.307 CC lib/idxd/idxd_user.o 00:01:54.566 LIB libspdk_conf.a 00:01:54.566 LIB libspdk_rdma.a 00:01:54.566 LIB libspdk_json.a 00:01:54.566 LIB libspdk_idxd.a 00:01:54.566 LIB libspdk_vmd.a 00:01:54.826 CC lib/jsonrpc/jsonrpc_server.o 00:01:54.826 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:01:54.826 CC lib/jsonrpc/jsonrpc_client.o 00:01:54.826 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:01:54.826 LIB libspdk_jsonrpc.a 00:01:55.393 LIB libspdk_env_dpdk.a 00:01:55.393 CC lib/rpc/rpc.o 00:01:55.393 LIB libspdk_rpc.a 00:01:55.652 CC lib/sock/sock_rpc.o 00:01:55.652 CC lib/sock/sock.o 00:01:55.652 CC lib/trace/trace_flags.o 00:01:55.652 CC lib/trace/trace.o 00:01:55.652 CC lib/trace/trace_rpc.o 00:01:55.652 CC lib/notify/notify.o 00:01:55.652 CC lib/notify/notify_rpc.o 00:01:55.912 LIB libspdk_notify.a 00:01:55.912 LIB libspdk_trace.a 00:01:55.912 LIB libspdk_sock.a 00:01:56.172 CC lib/thread/thread.o 00:01:56.172 CC lib/thread/iobuf.o 00:01:56.172 CC lib/nvme/nvme_ctrlr_cmd.o 00:01:56.172 CC lib/nvme/nvme_ctrlr.o 00:01:56.172 CC lib/nvme/nvme_fabric.o 00:01:56.172 CC lib/nvme/nvme_ns_cmd.o 00:01:56.172 CC lib/nvme/nvme_pcie.o 00:01:56.172 CC lib/nvme/nvme_ns.o 00:01:56.172 CC lib/nvme/nvme_pcie_common.o 00:01:56.172 CC lib/nvme/nvme.o 00:01:56.172 CC lib/nvme/nvme_quirks.o 00:01:56.172 CC lib/nvme/nvme_transport.o 00:01:56.172 CC lib/nvme/nvme_qpair.o 00:01:56.172 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:01:56.172 CC lib/nvme/nvme_discovery.o 00:01:56.172 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:01:56.172 CC lib/nvme/nvme_tcp.o 00:01:56.172 CC lib/nvme/nvme_opal.o 00:01:56.172 CC lib/nvme/nvme_io_msg.o 00:01:56.172 CC lib/nvme/nvme_poll_group.o 00:01:56.172 CC lib/nvme/nvme_zns.o 00:01:56.172 CC lib/nvme/nvme_cuse.o 00:01:56.172 CC lib/nvme/nvme_vfio_user.o 00:01:56.172 CC lib/nvme/nvme_rdma.o 00:01:57.110 LIB libspdk_thread.a 00:01:57.110 CC lib/blob/blobstore.o 00:01:57.110 CC lib/blob/zeroes.o 00:01:57.110 CC lib/blob/request.o 00:01:57.110 CC lib/blob/blob_bs_dev.o 00:01:57.110 CC lib/virtio/virtio.o 00:01:57.110 CC lib/virtio/virtio_vhost_user.o 00:01:57.110 CC lib/virtio/virtio_vfio_user.o 00:01:57.110 CC lib/virtio/virtio_pci.o 00:01:57.110 CC lib/init/subsystem.o 00:01:57.110 CC lib/init/json_config.o 00:01:57.110 CC lib/init/subsystem_rpc.o 00:01:57.110 CC lib/init/rpc.o 00:01:57.110 CC lib/accel/accel.o 00:01:57.110 CC lib/accel/accel_rpc.o 00:01:57.110 
CC lib/accel/accel_sw.o 00:01:57.110 CC lib/vfu_tgt/tgt_endpoint.o 00:01:57.110 CC lib/vfu_tgt/tgt_rpc.o 00:01:57.369 LIB libspdk_init.a 00:01:57.369 LIB libspdk_virtio.a 00:01:57.369 LIB libspdk_vfu_tgt.a 00:01:57.369 LIB libspdk_nvme.a 00:01:57.627 CC lib/event/app.o 00:01:57.627 CC lib/event/reactor.o 00:01:57.627 CC lib/event/log_rpc.o 00:01:57.627 CC lib/event/app_rpc.o 00:01:57.627 CC lib/event/scheduler_static.o 00:01:57.886 LIB libspdk_accel.a 00:01:57.886 LIB libspdk_event.a 00:01:58.144 CC lib/bdev/bdev.o 00:01:58.144 CC lib/bdev/part.o 00:01:58.144 CC lib/bdev/bdev_rpc.o 00:01:58.144 CC lib/bdev/bdev_zone.o 00:01:58.144 CC lib/bdev/scsi_nvme.o 00:01:58.712 LIB libspdk_blob.a 00:01:58.972 CC lib/lvol/lvol.o 00:01:58.972 CC lib/blobfs/blobfs.o 00:01:58.972 CC lib/blobfs/tree.o 00:01:59.540 LIB libspdk_lvol.a 00:01:59.540 LIB libspdk_blobfs.a 00:01:59.797 LIB libspdk_bdev.a 00:02:00.054 CC lib/nbd/nbd.o 00:02:00.054 CC lib/nbd/nbd_rpc.o 00:02:00.054 CC lib/ublk/ublk_rpc.o 00:02:00.054 CC lib/ublk/ublk.o 00:02:00.054 CC lib/scsi/dev.o 00:02:00.054 CC lib/ftl/ftl_core.o 00:02:00.054 CC lib/scsi/lun.o 00:02:00.054 CC lib/ftl/ftl_init.o 00:02:00.054 CC lib/scsi/port.o 00:02:00.054 CC lib/scsi/scsi.o 00:02:00.054 CC lib/ftl/ftl_layout.o 00:02:00.054 CC lib/ftl/ftl_debug.o 00:02:00.054 CC lib/scsi/scsi_bdev.o 00:02:00.054 CC lib/ftl/ftl_io.o 00:02:00.054 CC lib/scsi/scsi_pr.o 00:02:00.054 CC lib/ftl/ftl_sb.o 00:02:00.054 CC lib/scsi/scsi_rpc.o 00:02:00.054 CC lib/ftl/ftl_l2p.o 00:02:00.054 CC lib/nvmf/ctrlr.o 00:02:00.054 CC lib/scsi/task.o 00:02:00.054 CC lib/ftl/ftl_l2p_flat.o 00:02:00.054 CC lib/nvmf/ctrlr_discovery.o 00:02:00.054 CC lib/nvmf/ctrlr_bdev.o 00:02:00.054 CC lib/ftl/ftl_nv_cache.o 00:02:00.054 CC lib/nvmf/nvmf_rpc.o 00:02:00.054 CC lib/ftl/ftl_band.o 00:02:00.054 CC lib/nvmf/subsystem.o 00:02:00.054 CC lib/nvmf/nvmf.o 00:02:00.054 CC lib/ftl/ftl_band_ops.o 00:02:00.054 CC lib/ftl/ftl_writer.o 00:02:00.054 CC lib/nvmf/transport.o 00:02:00.054 CC lib/ftl/ftl_rq.o 00:02:00.054 CC lib/nvmf/tcp.o 00:02:00.054 CC lib/ftl/ftl_reloc.o 00:02:00.054 CC lib/nvmf/vfio_user.o 00:02:00.054 CC lib/nvmf/rdma.o 00:02:00.054 CC lib/ftl/ftl_l2p_cache.o 00:02:00.054 CC lib/ftl/ftl_p2l.o 00:02:00.054 CC lib/ftl/mngt/ftl_mngt.o 00:02:00.054 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:00.054 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:00.054 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:00.054 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:00.054 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:00.054 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:00.054 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:00.054 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:00.054 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:00.054 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:00.054 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:00.054 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:00.054 CC lib/ftl/utils/ftl_md.o 00:02:00.054 CC lib/ftl/utils/ftl_conf.o 00:02:00.054 CC lib/ftl/utils/ftl_mempool.o 00:02:00.054 CC lib/ftl/utils/ftl_bitmap.o 00:02:00.054 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:00.054 CC lib/ftl/utils/ftl_property.o 00:02:00.054 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:00.054 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:00.054 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:00.054 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:00.054 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:00.054 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:00.054 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:00.054 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:00.054 CC lib/ftl/base/ftl_base_dev.o 00:02:00.054 CC 
lib/ftl/upgrade/ftl_sb_v5.o 00:02:00.054 CC lib/ftl/base/ftl_base_bdev.o 00:02:00.054 CC lib/ftl/ftl_trace.o 00:02:00.311 LIB libspdk_scsi.a 00:02:00.311 LIB libspdk_nbd.a 00:02:00.569 LIB libspdk_ublk.a 00:02:00.569 LIB libspdk_ftl.a 00:02:00.827 CC lib/iscsi/init_grp.o 00:02:00.827 CC lib/iscsi/iscsi.o 00:02:00.827 CC lib/iscsi/conn.o 00:02:00.827 CC lib/iscsi/md5.o 00:02:00.827 CC lib/vhost/vhost.o 00:02:00.827 CC lib/iscsi/param.o 00:02:00.827 CC lib/iscsi/iscsi_subsystem.o 00:02:00.827 CC lib/vhost/vhost_rpc.o 00:02:00.827 CC lib/iscsi/portal_grp.o 00:02:00.827 CC lib/vhost/vhost_scsi.o 00:02:00.827 CC lib/iscsi/tgt_node.o 00:02:00.827 CC lib/vhost/vhost_blk.o 00:02:00.827 CC lib/iscsi/iscsi_rpc.o 00:02:00.827 CC lib/vhost/rte_vhost_user.o 00:02:00.827 CC lib/iscsi/task.o 00:02:01.085 LIB libspdk_nvmf.a 00:02:01.343 LIB libspdk_vhost.a 00:02:01.601 LIB libspdk_iscsi.a 00:02:01.859 CC module/vfu_device/vfu_virtio.o 00:02:01.859 CC module/vfu_device/vfu_virtio_scsi.o 00:02:01.859 CC module/env_dpdk/env_dpdk_rpc.o 00:02:01.859 CC module/vfu_device/vfu_virtio_blk.o 00:02:01.859 CC module/vfu_device/vfu_virtio_rpc.o 00:02:02.118 CC module/sock/posix/posix.o 00:02:02.118 LIB libspdk_env_dpdk_rpc.a 00:02:02.118 CC module/accel/dsa/accel_dsa.o 00:02:02.118 CC module/accel/dsa/accel_dsa_rpc.o 00:02:02.118 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:02.118 CC module/scheduler/gscheduler/gscheduler.o 00:02:02.118 CC module/accel/iaa/accel_iaa_rpc.o 00:02:02.118 CC module/accel/iaa/accel_iaa.o 00:02:02.118 CC module/accel/ioat/accel_ioat.o 00:02:02.118 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:02.118 CC module/blob/bdev/blob_bdev.o 00:02:02.118 CC module/accel/ioat/accel_ioat_rpc.o 00:02:02.118 CC module/accel/error/accel_error.o 00:02:02.118 CC module/accel/error/accel_error_rpc.o 00:02:02.118 LIB libspdk_scheduler_gscheduler.a 00:02:02.118 LIB libspdk_scheduler_dpdk_governor.a 00:02:02.118 LIB libspdk_accel_iaa.a 00:02:02.118 LIB libspdk_scheduler_dynamic.a 00:02:02.118 LIB libspdk_accel_ioat.a 00:02:02.118 LIB libspdk_accel_error.a 00:02:02.118 LIB libspdk_accel_dsa.a 00:02:02.118 LIB libspdk_blob_bdev.a 00:02:02.377 LIB libspdk_vfu_device.a 00:02:02.377 LIB libspdk_sock_posix.a 00:02:02.636 CC module/bdev/raid/bdev_raid_rpc.o 00:02:02.636 CC module/bdev/raid/bdev_raid.o 00:02:02.636 CC module/bdev/raid/raid0.o 00:02:02.636 CC module/bdev/raid/bdev_raid_sb.o 00:02:02.636 CC module/bdev/raid/concat.o 00:02:02.636 CC module/bdev/nvme/bdev_nvme.o 00:02:02.636 CC module/bdev/raid/raid1.o 00:02:02.636 CC module/bdev/passthru/vbdev_passthru.o 00:02:02.636 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:02.636 CC module/bdev/lvol/vbdev_lvol.o 00:02:02.636 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:02.636 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:02.636 CC module/bdev/nvme/nvme_rpc.o 00:02:02.636 CC module/bdev/nvme/bdev_mdns_client.o 00:02:02.636 CC module/bdev/nvme/vbdev_opal.o 00:02:02.636 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:02.636 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:02.636 CC module/bdev/gpt/gpt.o 00:02:02.636 CC module/bdev/gpt/vbdev_gpt.o 00:02:02.636 CC module/blobfs/bdev/blobfs_bdev.o 00:02:02.636 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:02.636 CC module/bdev/error/vbdev_error.o 00:02:02.636 CC module/bdev/error/vbdev_error_rpc.o 00:02:02.636 CC module/bdev/iscsi/bdev_iscsi.o 00:02:02.636 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:02.636 CC module/bdev/split/vbdev_split.o 00:02:02.636 CC module/bdev/ftl/bdev_ftl.o 00:02:02.636 CC 
module/bdev/split/vbdev_split_rpc.o 00:02:02.636 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:02.636 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:02.636 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:02.636 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:02.636 CC module/bdev/aio/bdev_aio.o 00:02:02.636 CC module/bdev/aio/bdev_aio_rpc.o 00:02:02.636 CC module/bdev/delay/vbdev_delay.o 00:02:02.636 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:02.636 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:02.636 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:02.636 CC module/bdev/malloc/bdev_malloc.o 00:02:02.636 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:02.636 CC module/bdev/null/bdev_null.o 00:02:02.636 CC module/bdev/null/bdev_null_rpc.o 00:02:02.894 LIB libspdk_blobfs_bdev.a 00:02:02.894 LIB libspdk_bdev_split.a 00:02:02.894 LIB libspdk_bdev_gpt.a 00:02:02.894 LIB libspdk_bdev_error.a 00:02:02.894 LIB libspdk_bdev_passthru.a 00:02:02.894 LIB libspdk_bdev_ftl.a 00:02:02.894 LIB libspdk_bdev_null.a 00:02:02.894 LIB libspdk_bdev_aio.a 00:02:02.894 LIB libspdk_bdev_iscsi.a 00:02:02.894 LIB libspdk_bdev_zone_block.a 00:02:02.894 LIB libspdk_bdev_delay.a 00:02:02.894 LIB libspdk_bdev_malloc.a 00:02:02.894 LIB libspdk_bdev_lvol.a 00:02:03.154 LIB libspdk_bdev_virtio.a 00:02:03.154 LIB libspdk_bdev_raid.a 00:02:03.724 LIB libspdk_bdev_nvme.a 00:02:04.293 CC module/event/subsystems/sock/sock.o 00:02:04.293 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:04.293 CC module/event/subsystems/vmd/vmd.o 00:02:04.293 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:04.293 CC module/event/subsystems/iobuf/iobuf.o 00:02:04.293 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:04.293 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:02:04.293 CC module/event/subsystems/scheduler/scheduler.o 00:02:04.553 LIB libspdk_event_sock.a 00:02:04.553 LIB libspdk_event_vhost_blk.a 00:02:04.553 LIB libspdk_event_vmd.a 00:02:04.553 LIB libspdk_event_vfu_tgt.a 00:02:04.553 LIB libspdk_event_scheduler.a 00:02:04.553 LIB libspdk_event_iobuf.a 00:02:04.813 CC module/event/subsystems/accel/accel.o 00:02:04.813 LIB libspdk_event_accel.a 00:02:05.394 CC module/event/subsystems/bdev/bdev.o 00:02:05.394 LIB libspdk_event_bdev.a 00:02:05.652 CC module/event/subsystems/scsi/scsi.o 00:02:05.652 CC module/event/subsystems/nbd/nbd.o 00:02:05.652 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:05.652 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:05.652 CC module/event/subsystems/ublk/ublk.o 00:02:05.652 LIB libspdk_event_scsi.a 00:02:05.913 LIB libspdk_event_nbd.a 00:02:05.913 LIB libspdk_event_ublk.a 00:02:05.913 LIB libspdk_event_nvmf.a 00:02:06.174 CC module/event/subsystems/iscsi/iscsi.o 00:02:06.174 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:06.174 LIB libspdk_event_iscsi.a 00:02:06.174 LIB libspdk_event_vhost_scsi.a 00:02:06.433 TEST_HEADER include/spdk/accel.h 00:02:06.433 TEST_HEADER include/spdk/barrier.h 00:02:06.433 TEST_HEADER include/spdk/accel_module.h 00:02:06.433 TEST_HEADER include/spdk/assert.h 00:02:06.433 TEST_HEADER include/spdk/base64.h 00:02:06.433 TEST_HEADER include/spdk/bdev_module.h 00:02:06.434 TEST_HEADER include/spdk/bdev_zone.h 00:02:06.434 TEST_HEADER include/spdk/bdev.h 00:02:06.434 TEST_HEADER include/spdk/bit_pool.h 00:02:06.434 TEST_HEADER include/spdk/bit_array.h 00:02:06.434 TEST_HEADER include/spdk/blob_bdev.h 00:02:06.434 TEST_HEADER include/spdk/blobfs.h 00:02:06.434 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:06.434 TEST_HEADER include/spdk/blob.h 00:02:06.434 CC 
app/trace_record/trace_record.o 00:02:06.434 TEST_HEADER include/spdk/conf.h 00:02:06.434 TEST_HEADER include/spdk/config.h 00:02:06.434 TEST_HEADER include/spdk/cpuset.h 00:02:06.434 CC app/spdk_lspci/spdk_lspci.o 00:02:06.434 TEST_HEADER include/spdk/crc16.h 00:02:06.434 CC app/spdk_nvme_identify/identify.o 00:02:06.434 TEST_HEADER include/spdk/crc64.h 00:02:06.434 TEST_HEADER include/spdk/crc32.h 00:02:06.434 CC app/spdk_top/spdk_top.o 00:02:06.434 TEST_HEADER include/spdk/dif.h 00:02:06.434 TEST_HEADER include/spdk/endian.h 00:02:06.434 TEST_HEADER include/spdk/dma.h 00:02:06.434 TEST_HEADER include/spdk/env_dpdk.h 00:02:06.434 TEST_HEADER include/spdk/env.h 00:02:06.434 TEST_HEADER include/spdk/event.h 00:02:06.434 TEST_HEADER include/spdk/fd_group.h 00:02:06.434 CC app/spdk_nvme_discover/discovery_aer.o 00:02:06.434 TEST_HEADER include/spdk/fd.h 00:02:06.434 TEST_HEADER include/spdk/file.h 00:02:06.434 TEST_HEADER include/spdk/ftl.h 00:02:06.434 CC test/rpc_client/rpc_client_test.o 00:02:06.434 TEST_HEADER include/spdk/gpt_spec.h 00:02:06.434 CC app/spdk_nvme_perf/perf.o 00:02:06.434 TEST_HEADER include/spdk/hexlify.h 00:02:06.434 TEST_HEADER include/spdk/histogram_data.h 00:02:06.434 TEST_HEADER include/spdk/idxd.h 00:02:06.434 TEST_HEADER include/spdk/idxd_spec.h 00:02:06.434 TEST_HEADER include/spdk/init.h 00:02:06.434 TEST_HEADER include/spdk/ioat.h 00:02:06.434 TEST_HEADER include/spdk/ioat_spec.h 00:02:06.434 TEST_HEADER include/spdk/iscsi_spec.h 00:02:06.434 TEST_HEADER include/spdk/json.h 00:02:06.434 TEST_HEADER include/spdk/jsonrpc.h 00:02:06.434 TEST_HEADER include/spdk/likely.h 00:02:06.434 TEST_HEADER include/spdk/log.h 00:02:06.434 TEST_HEADER include/spdk/lvol.h 00:02:06.434 TEST_HEADER include/spdk/memory.h 00:02:06.434 TEST_HEADER include/spdk/mmio.h 00:02:06.434 TEST_HEADER include/spdk/notify.h 00:02:06.434 TEST_HEADER include/spdk/nvme.h 00:02:06.434 TEST_HEADER include/spdk/nbd.h 00:02:06.434 TEST_HEADER include/spdk/nvme_intel.h 00:02:06.434 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:06.434 TEST_HEADER include/spdk/nvme_spec.h 00:02:06.434 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:06.434 TEST_HEADER include/spdk/nvme_zns.h 00:02:06.434 CXX app/trace/trace.o 00:02:06.434 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:06.434 TEST_HEADER include/spdk/nvmf.h 00:02:06.434 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:06.434 TEST_HEADER include/spdk/nvmf_spec.h 00:02:06.434 TEST_HEADER include/spdk/nvmf_transport.h 00:02:06.434 TEST_HEADER include/spdk/opal.h 00:02:06.434 TEST_HEADER include/spdk/opal_spec.h 00:02:06.434 TEST_HEADER include/spdk/pci_ids.h 00:02:06.434 TEST_HEADER include/spdk/pipe.h 00:02:06.434 TEST_HEADER include/spdk/queue.h 00:02:06.434 TEST_HEADER include/spdk/rpc.h 00:02:06.434 TEST_HEADER include/spdk/reduce.h 00:02:06.434 TEST_HEADER include/spdk/scsi.h 00:02:06.697 TEST_HEADER include/spdk/scheduler.h 00:02:06.697 TEST_HEADER include/spdk/scsi_spec.h 00:02:06.697 TEST_HEADER include/spdk/sock.h 00:02:06.697 TEST_HEADER include/spdk/stdinc.h 00:02:06.697 TEST_HEADER include/spdk/string.h 00:02:06.697 TEST_HEADER include/spdk/thread.h 00:02:06.697 TEST_HEADER include/spdk/trace.h 00:02:06.697 TEST_HEADER include/spdk/tree.h 00:02:06.697 TEST_HEADER include/spdk/trace_parser.h 00:02:06.697 TEST_HEADER include/spdk/ublk.h 00:02:06.697 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:06.697 TEST_HEADER include/spdk/util.h 00:02:06.697 CC app/spdk_dd/spdk_dd.o 00:02:06.697 TEST_HEADER include/spdk/uuid.h 00:02:06.697 TEST_HEADER 
include/spdk/version.h 00:02:06.697 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:06.697 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:06.697 TEST_HEADER include/spdk/vmd.h 00:02:06.697 TEST_HEADER include/spdk/vhost.h 00:02:06.697 TEST_HEADER include/spdk/xor.h 00:02:06.697 TEST_HEADER include/spdk/zipf.h 00:02:06.697 CXX test/cpp_headers/accel.o 00:02:06.697 CXX test/cpp_headers/accel_module.o 00:02:06.697 CXX test/cpp_headers/assert.o 00:02:06.697 CXX test/cpp_headers/barrier.o 00:02:06.697 CXX test/cpp_headers/bdev.o 00:02:06.697 CXX test/cpp_headers/base64.o 00:02:06.697 CXX test/cpp_headers/bdev_module.o 00:02:06.697 CXX test/cpp_headers/bit_array.o 00:02:06.697 CXX test/cpp_headers/bdev_zone.o 00:02:06.697 CXX test/cpp_headers/bit_pool.o 00:02:06.697 CXX test/cpp_headers/blob_bdev.o 00:02:06.697 CXX test/cpp_headers/blobfs_bdev.o 00:02:06.697 CC app/spdk_tgt/spdk_tgt.o 00:02:06.697 CXX test/cpp_headers/blobfs.o 00:02:06.697 CXX test/cpp_headers/blob.o 00:02:06.697 CXX test/cpp_headers/conf.o 00:02:06.697 CXX test/cpp_headers/config.o 00:02:06.697 CXX test/cpp_headers/crc16.o 00:02:06.697 CXX test/cpp_headers/cpuset.o 00:02:06.697 CXX test/cpp_headers/crc32.o 00:02:06.697 CXX test/cpp_headers/dif.o 00:02:06.697 CXX test/cpp_headers/crc64.o 00:02:06.697 CXX test/cpp_headers/dma.o 00:02:06.697 CC app/nvmf_tgt/nvmf_main.o 00:02:06.697 CC app/vhost/vhost.o 00:02:06.697 CXX test/cpp_headers/endian.o 00:02:06.697 CXX test/cpp_headers/env_dpdk.o 00:02:06.697 CXX test/cpp_headers/env.o 00:02:06.697 CXX test/cpp_headers/event.o 00:02:06.697 CXX test/cpp_headers/fd_group.o 00:02:06.697 CXX test/cpp_headers/fd.o 00:02:06.697 CXX test/cpp_headers/file.o 00:02:06.697 CXX test/cpp_headers/ftl.o 00:02:06.697 CXX test/cpp_headers/hexlify.o 00:02:06.697 CXX test/cpp_headers/gpt_spec.o 00:02:06.697 CXX test/cpp_headers/histogram_data.o 00:02:06.697 CXX test/cpp_headers/idxd.o 00:02:06.697 CXX test/cpp_headers/idxd_spec.o 00:02:06.697 CXX test/cpp_headers/init.o 00:02:06.697 CC app/iscsi_tgt/iscsi_tgt.o 00:02:06.697 CC test/app/jsoncat/jsoncat.o 00:02:06.697 CC test/app/stub/stub.o 00:02:06.697 CC test/app/histogram_perf/histogram_perf.o 00:02:06.697 CC test/event/event_perf/event_perf.o 00:02:06.697 CC test/thread/poller_perf/poller_perf.o 00:02:06.697 CC test/thread/lock/spdk_lock.o 00:02:06.697 CC test/event/reactor_perf/reactor_perf.o 00:02:06.697 CC test/event/reactor/reactor.o 00:02:06.697 CC test/nvme/startup/startup.o 00:02:06.697 CC test/nvme/reset/reset.o 00:02:06.697 CC test/nvme/simple_copy/simple_copy.o 00:02:06.697 CC test/nvme/sgl/sgl.o 00:02:06.697 CXX test/cpp_headers/ioat.o 00:02:06.697 CC test/event/app_repeat/app_repeat.o 00:02:06.697 CC test/nvme/reserve/reserve.o 00:02:06.697 CC test/nvme/cuse/cuse.o 00:02:06.697 CC test/nvme/aer/aer.o 00:02:06.697 CC examples/ioat/perf/perf.o 00:02:06.697 CC examples/ioat/verify/verify.o 00:02:06.697 CC test/nvme/fused_ordering/fused_ordering.o 00:02:06.697 CC test/nvme/e2edp/nvme_dp.o 00:02:06.697 CC app/fio/nvme/fio_plugin.o 00:02:06.697 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:06.697 CC test/nvme/err_injection/err_injection.o 00:02:06.697 CC test/nvme/compliance/nvme_compliance.o 00:02:06.697 CC test/nvme/connect_stress/connect_stress.o 00:02:06.697 CC test/nvme/overhead/overhead.o 00:02:06.697 CC test/nvme/boot_partition/boot_partition.o 00:02:06.697 CC examples/nvme/hello_world/hello_world.o 00:02:06.697 CC test/env/vtophys/vtophys.o 00:02:06.697 CC examples/nvme/reconnect/reconnect.o 00:02:06.697 CC 
examples/nvme/arbitration/arbitration.o 00:02:06.697 CC test/nvme/fdp/fdp.o 00:02:06.697 CC test/env/memory/memory_ut.o 00:02:06.697 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:06.697 CC examples/util/zipf/zipf.o 00:02:06.697 CC examples/nvme/hotplug/hotplug.o 00:02:06.697 CC test/env/pci/pci_ut.o 00:02:06.697 CC examples/idxd/perf/perf.o 00:02:06.697 CC examples/vmd/led/led.o 00:02:06.697 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:06.697 CC examples/vmd/lsvmd/lsvmd.o 00:02:06.697 CC examples/nvme/abort/abort.o 00:02:06.697 CC examples/accel/perf/accel_perf.o 00:02:06.697 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:06.697 CC examples/sock/hello_world/hello_sock.o 00:02:06.697 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:06.697 CC test/accel/dif/dif.o 00:02:06.697 CC test/bdev/bdevio/bdevio.o 00:02:06.697 CC test/dma/test_dma/test_dma.o 00:02:06.697 LINK spdk_lspci 00:02:06.697 CC test/blobfs/mkfs/mkfs.o 00:02:06.697 CC examples/blob/hello_world/hello_blob.o 00:02:06.697 CC test/event/scheduler/scheduler.o 00:02:06.697 CC examples/blob/cli/blobcli.o 00:02:06.697 CC test/app/bdev_svc/bdev_svc.o 00:02:06.697 CC examples/bdev/bdevperf/bdevperf.o 00:02:06.697 CC examples/bdev/hello_world/hello_bdev.o 00:02:06.698 CC app/fio/bdev/fio_plugin.o 00:02:06.698 CC examples/nvmf/nvmf/nvmf.o 00:02:06.698 CC examples/thread/thread/thread_ex.o 00:02:06.698 CC test/lvol/esnap/esnap.o 00:02:06.698 LINK rpc_client_test 00:02:06.698 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:06.698 CC test/env/mem_callbacks/mem_callbacks.o 00:02:06.698 LINK spdk_nvme_discover 00:02:06.966 CXX test/cpp_headers/ioat_spec.o 00:02:06.966 LINK jsoncat 00:02:06.966 CXX test/cpp_headers/iscsi_spec.o 00:02:06.966 CXX test/cpp_headers/json.o 00:02:06.966 CXX test/cpp_headers/jsonrpc.o 00:02:06.966 LINK spdk_trace_record 00:02:06.966 CXX test/cpp_headers/likely.o 00:02:06.966 CXX test/cpp_headers/log.o 00:02:06.966 CXX test/cpp_headers/lvol.o 00:02:06.966 CXX test/cpp_headers/memory.o 00:02:06.966 CXX test/cpp_headers/mmio.o 00:02:06.966 CXX test/cpp_headers/nbd.o 00:02:06.966 CXX test/cpp_headers/notify.o 00:02:06.966 LINK interrupt_tgt 00:02:06.966 CXX test/cpp_headers/nvme.o 00:02:06.966 CXX test/cpp_headers/nvme_intel.o 00:02:06.966 CXX test/cpp_headers/nvme_ocssd.o 00:02:06.966 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:06.966 LINK reactor_perf 00:02:06.966 CXX test/cpp_headers/nvme_spec.o 00:02:06.966 LINK histogram_perf 00:02:06.966 CXX test/cpp_headers/nvme_zns.o 00:02:06.966 LINK poller_perf 00:02:06.966 CXX test/cpp_headers/nvmf_cmd.o 00:02:06.966 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:06.966 LINK event_perf 00:02:06.966 CXX test/cpp_headers/nvmf.o 00:02:06.966 CXX test/cpp_headers/nvmf_spec.o 00:02:06.966 CXX test/cpp_headers/nvmf_transport.o 00:02:06.966 LINK lsvmd 00:02:06.966 LINK reactor 00:02:06.966 CXX test/cpp_headers/opal.o 00:02:06.966 CXX test/cpp_headers/opal_spec.o 00:02:06.966 CXX test/cpp_headers/pci_ids.o 00:02:06.966 CXX test/cpp_headers/pipe.o 00:02:06.966 LINK vhost 00:02:06.966 CXX test/cpp_headers/queue.o 00:02:06.966 LINK nvmf_tgt 00:02:06.966 LINK app_repeat 00:02:06.966 CXX test/cpp_headers/reduce.o 00:02:06.966 LINK vtophys 00:02:06.966 LINK zipf 00:02:06.966 LINK stub 00:02:06.966 LINK led 00:02:06.966 LINK env_dpdk_post_init 00:02:06.966 CXX test/cpp_headers/rpc.o 00:02:06.966 CXX test/cpp_headers/scheduler.o 00:02:06.966 LINK spdk_tgt 00:02:06.966 CXX test/cpp_headers/scsi.o 00:02:06.966 CXX test/cpp_headers/scsi_spec.o 00:02:06.966 CXX test/cpp_headers/sock.o 
00:02:06.966 LINK connect_stress 00:02:06.966 LINK startup 00:02:06.966 LINK err_injection 00:02:06.966 LINK boot_partition 00:02:06.966 LINK iscsi_tgt 00:02:06.966 LINK reserve 00:02:06.966 CXX test/cpp_headers/stdinc.o 00:02:06.966 LINK pmr_persistence 00:02:06.966 LINK doorbell_aers 00:02:06.966 LINK fused_ordering 00:02:06.966 LINK verify 00:02:06.966 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:06.966 LINK cmb_copy 00:02:06.966 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:06.966 LINK ioat_perf 00:02:06.966 LINK simple_copy 00:02:06.966 LINK hotplug 00:02:06.966 LINK hello_world 00:02:06.966 LINK mkfs 00:02:06.966 LINK hello_sock 00:02:06.966 LINK bdev_svc 00:02:06.966 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:02:06.966 LINK aer 00:02:06.966 LINK nvme_dp 00:02:06.966 LINK scheduler 00:02:06.966 LINK reset 00:02:06.966 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:02:06.966 LINK fdp 00:02:06.966 LINK sgl 00:02:06.966 LINK hello_blob 00:02:06.966 LINK overhead 00:02:06.966 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:06.966 LINK hello_bdev 00:02:06.966 CXX test/cpp_headers/string.o 00:02:06.966 CXX test/cpp_headers/thread.o 00:02:06.966 CXX test/cpp_headers/trace.o 00:02:06.966 CXX test/cpp_headers/trace_parser.o 00:02:06.966 CXX test/cpp_headers/tree.o 00:02:06.966 CXX test/cpp_headers/ublk.o 00:02:06.966 CXX test/cpp_headers/util.o 00:02:06.966 CXX test/cpp_headers/uuid.o 00:02:06.966 CXX test/cpp_headers/version.o 00:02:06.966 CXX test/cpp_headers/vfio_user_pci.o 00:02:06.966 CXX test/cpp_headers/vfio_user_spec.o 00:02:06.966 CXX test/cpp_headers/vhost.o 00:02:07.227 CXX test/cpp_headers/vmd.o 00:02:07.227 CXX test/cpp_headers/xor.o 00:02:07.227 LINK idxd_perf 00:02:07.227 CXX test/cpp_headers/zipf.o 00:02:07.227 LINK thread 00:02:07.227 LINK spdk_trace 00:02:07.227 LINK reconnect 00:02:07.227 LINK nvmf 00:02:07.227 LINK arbitration 00:02:07.227 LINK abort 00:02:07.227 LINK test_dma 00:02:07.227 LINK dif 00:02:07.227 LINK bdevio 00:02:07.227 LINK spdk_dd 00:02:07.227 LINK nvme_compliance 00:02:07.227 LINK pci_ut 00:02:07.227 LINK blobcli 00:02:07.227 LINK accel_perf 00:02:07.227 LINK nvme_manage 00:02:07.486 LINK nvme_fuzz 00:02:07.486 LINK spdk_nvme 00:02:07.486 LINK llvm_vfio_fuzz 00:02:07.486 LINK spdk_nvme_identify 00:02:07.486 LINK mem_callbacks 00:02:07.486 LINK spdk_bdev 00:02:07.486 LINK spdk_top 00:02:07.744 LINK vhost_fuzz 00:02:07.744 LINK cuse 00:02:07.744 LINK spdk_nvme_perf 00:02:07.744 LINK bdevperf 00:02:07.744 LINK memory_ut 00:02:08.003 LINK llvm_nvme_fuzz 00:02:08.003 LINK spdk_lock 00:02:08.570 LINK iscsi_fuzz 00:02:10.475 LINK esnap 00:02:10.735 00:02:10.735 real 0m41.597s 00:02:10.735 user 5m43.254s 00:02:10.735 sys 2m50.283s 00:02:10.735 15:17:56 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:02:10.735 15:17:56 -- common/autotest_common.sh@10 -- $ set +x 00:02:10.735 ************************************ 00:02:10.735 END TEST make 00:02:10.735 ************************************ 00:02:10.735 15:17:56 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:02:10.735 15:17:56 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:02:10.735 15:17:56 -- common/autotest_common.sh@1690 -- # lcov --version 00:02:10.995 15:17:56 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:02:10.995 15:17:56 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:02:10.995 15:17:56 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:02:10.995 15:17:56 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:02:10.995 15:17:56 -- 
scripts/common.sh@335 -- # IFS=.-: 00:02:10.995 15:17:56 -- scripts/common.sh@335 -- # read -ra ver1 00:02:10.995 15:17:56 -- scripts/common.sh@336 -- # IFS=.-: 00:02:10.995 15:17:56 -- scripts/common.sh@336 -- # read -ra ver2 00:02:10.995 15:17:56 -- scripts/common.sh@337 -- # local 'op=<' 00:02:10.995 15:17:56 -- scripts/common.sh@339 -- # ver1_l=2 00:02:10.995 15:17:56 -- scripts/common.sh@340 -- # ver2_l=1 00:02:10.995 15:17:56 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:02:10.996 15:17:56 -- scripts/common.sh@343 -- # case "$op" in 00:02:10.996 15:17:56 -- scripts/common.sh@344 -- # : 1 00:02:10.996 15:17:56 -- scripts/common.sh@363 -- # (( v = 0 )) 00:02:10.996 15:17:56 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:10.996 15:17:56 -- scripts/common.sh@364 -- # decimal 1 00:02:10.996 15:17:56 -- scripts/common.sh@352 -- # local d=1 00:02:10.996 15:17:56 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:02:10.996 15:17:56 -- scripts/common.sh@354 -- # echo 1 00:02:10.996 15:17:56 -- scripts/common.sh@364 -- # ver1[v]=1 00:02:10.996 15:17:56 -- scripts/common.sh@365 -- # decimal 2 00:02:10.996 15:17:56 -- scripts/common.sh@352 -- # local d=2 00:02:10.996 15:17:56 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:02:10.996 15:17:56 -- scripts/common.sh@354 -- # echo 2 00:02:10.996 15:17:56 -- scripts/common.sh@365 -- # ver2[v]=2 00:02:10.996 15:17:56 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:02:10.996 15:17:56 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:02:10.996 15:17:56 -- scripts/common.sh@367 -- # return 0 00:02:10.996 15:17:56 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:02:10.996 15:17:56 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:02:10.996 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:10.996 --rc genhtml_branch_coverage=1 00:02:10.996 --rc genhtml_function_coverage=1 00:02:10.996 --rc genhtml_legend=1 00:02:10.996 --rc geninfo_all_blocks=1 00:02:10.996 --rc geninfo_unexecuted_blocks=1 00:02:10.996 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:10.996 ' 00:02:10.996 15:17:56 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:02:10.996 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:10.996 --rc genhtml_branch_coverage=1 00:02:10.996 --rc genhtml_function_coverage=1 00:02:10.996 --rc genhtml_legend=1 00:02:10.996 --rc geninfo_all_blocks=1 00:02:10.996 --rc geninfo_unexecuted_blocks=1 00:02:10.996 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:10.996 ' 00:02:10.996 15:17:56 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:02:10.996 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:10.996 --rc genhtml_branch_coverage=1 00:02:10.996 --rc genhtml_function_coverage=1 00:02:10.996 --rc genhtml_legend=1 00:02:10.996 --rc geninfo_all_blocks=1 00:02:10.996 --rc geninfo_unexecuted_blocks=1 00:02:10.996 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:10.996 ' 00:02:10.996 15:17:56 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:02:10.996 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:10.996 --rc genhtml_branch_coverage=1 00:02:10.996 --rc genhtml_function_coverage=1 00:02:10.996 --rc genhtml_legend=1 00:02:10.996 --rc geninfo_all_blocks=1 00:02:10.996 --rc geninfo_unexecuted_blocks=1 00:02:10.996 
--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:10.996 ' 00:02:10.996 15:17:56 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:02:10.996 15:17:56 -- nvmf/common.sh@7 -- # uname -s 00:02:10.996 15:17:56 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:10.996 15:17:56 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:10.996 15:17:56 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:10.996 15:17:56 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:10.996 15:17:56 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:10.996 15:17:56 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:10.996 15:17:56 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:10.996 15:17:56 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:10.996 15:17:56 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:10.996 15:17:56 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:10.996 15:17:56 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:02:10.996 15:17:56 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:02:10.996 15:17:56 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:10.996 15:17:56 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:10.996 15:17:56 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:10.996 15:17:56 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:10.996 15:17:56 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:10.996 15:17:56 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:10.996 15:17:56 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:10.996 15:17:56 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:10.996 15:17:56 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:10.996 15:17:56 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:10.996 15:17:56 -- paths/export.sh@5 -- # export PATH 00:02:10.996 15:17:56 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:10.996 15:17:56 -- nvmf/common.sh@46 -- # : 0 00:02:10.996 15:17:56 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:02:10.996 15:17:56 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:02:10.996 15:17:56 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:02:10.996 15:17:56 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 
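The xtrace above steps through the lcov version gate in scripts/common.sh: the last field of `lcov --version` is split on IFS=.-: and compared component-wise against 2, and because 1.15 < 2 the pre-2.0 "--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1" options are exported into LCOV_OPTS. A minimal bash reconstruction of the comparison the trace shows (the function names cmp_versions and lt and the loop structure follow the trace; the body is a sketch, not the verbatim script):

cmp_versions() {
    local ver1 ver2 ver1_l ver2_l op=$2 v d1 d2
    IFS=.-: read -ra ver1 <<< "$1"
    IFS=.-: read -ra ver2 <<< "$3"
    ver1_l=${#ver1[@]} ver2_l=${#ver2[@]}
    for ((v = 0; v < (ver1_l > ver2_l ? ver1_l : ver2_l); v++)); do
        # Missing or non-numeric components are treated as 0.
        d1=${ver1[v]:-0} d2=${ver2[v]:-0}
        [[ $d1 =~ ^[0-9]+$ ]] || d1=0
        [[ $d2 =~ ^[0-9]+$ ]] || d2=0
        ((d1 > d2)) && { [[ $op == '>' ]]; return; }
        ((d1 < d2)) && { [[ $op == '<' ]]; return; }
    done
    [[ $op == '<=' || $op == '>=' || $op == '==' ]]
}
lt() { cmp_versions "$1" '<' "$2"; }

lt 1.15 2   # succeeds for the lcov 1.15 seen here, keeping the legacy flags
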
00:02:10.996 15:17:56 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:10.996 15:17:56 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:02:10.996 15:17:56 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:02:10.996 15:17:56 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:02:10.996 15:17:56 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:10.996 15:17:56 -- spdk/autotest.sh@32 -- # uname -s 00:02:10.996 15:17:56 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:10.996 15:17:56 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:10.996 15:17:56 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:10.996 15:17:56 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:10.996 15:17:56 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:10.996 15:17:56 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:10.996 15:17:56 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:10.996 15:17:56 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:10.996 15:17:56 -- spdk/autotest.sh@48 -- # udevadm_pid=1521413 00:02:10.996 15:17:56 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:10.996 15:17:56 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:10.996 15:17:56 -- spdk/autotest.sh@54 -- # echo 1521415 00:02:10.996 15:17:56 -- spdk/autotest.sh@56 -- # echo 1521416 00:02:10.996 15:17:56 -- spdk/autotest.sh@58 -- # [[ ............................... != QEMU ]] 00:02:10.996 15:17:56 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:10.996 15:17:56 -- spdk/autotest.sh@60 -- # echo 1521417 00:02:10.996 15:17:56 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:10.996 15:17:56 -- spdk/autotest.sh@62 -- # echo 1521418 00:02:10.996 15:17:56 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:10.996 15:17:56 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:10.996 15:17:56 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:02:10.996 15:17:56 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:10.996 15:17:56 -- common/autotest_common.sh@10 -- # set +x 00:02:10.996 15:17:56 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:10.996 15:17:56 -- spdk/autotest.sh@70 -- # create_test_list 00:02:10.996 15:17:56 -- common/autotest_common.sh@746 -- # xtrace_disable 00:02:10.996 15:17:56 -- common/autotest_common.sh@10 -- # set +x 00:02:10.996 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:02:10.996 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:02:10.996 15:17:56 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 
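Before any tests run, spdk/autotest.sh re-routes kernel coredumps through scripts/core-collector.sh and backgrounds several monitors (udevadm plus the perf/pm collectors, pids 1521413-1521418 above), then arms a single cleanup trap. The shape of that prologue, as a sketch (the collector scripts, their -d flag, the core-pattern string, and the trap command are taken from the trace; the /proc redirection and pid bookkeeping are assumed, since xtrace does not print redirections):

# rootdir: the SPDK checkout (assumed set by the caller).
output=$rootdir/../output
mkdir -p "$output/coredumps" "$output/power"

# Route coredumps through SPDK's collector for the whole run,
# presumably by rewriting /proc/sys/kernel/core_pattern.
old_core_pattern=$(< /proc/sys/kernel/core_pattern)
echo "|$rootdir/scripts/core-collector.sh %P %s %t" > /proc/sys/kernel/core_pattern

# Background the monitors and remember their pids for cleanup.
/usr/sbin/udevadm monitor --property &
udevadm_pid=$!
"$rootdir/scripts/perf/pm/collect-cpu-load" -d "$output/power" &
"$rootdir/scripts/perf/pm/collect-vmstat" -d "$output/power" &

# One exit path: any signal or normal exit funnels through cleanup.
trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT
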
00:02:10.996 15:17:56 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:10.996 15:17:56 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:10.996 15:17:56 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:10.996 15:17:56 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:10.996 15:17:56 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:02:10.996 15:17:56 -- common/autotest_common.sh@1450 -- # uname 00:02:10.996 15:17:56 -- common/autotest_common.sh@1450 -- # '[' Linux = FreeBSD ']' 00:02:10.996 15:17:56 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:02:10.996 15:17:56 -- common/autotest_common.sh@1470 -- # uname 00:02:10.996 15:17:56 -- common/autotest_common.sh@1470 -- # [[ Linux = FreeBSD ]] 00:02:10.996 15:17:56 -- spdk/autotest.sh@79 -- # [[ y == y ]] 00:02:10.996 15:17:56 -- spdk/autotest.sh@81 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:02:10.996 lcov: LCOV version 1.15 00:02:10.997 15:17:56 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:02:12.907 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno 00:02:12.907 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno 00:02:12.907 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno 00:02:25.225 15:18:10 -- spdk/autotest.sh@87 -- # timing_enter pre_cleanup 00:02:25.225 15:18:10 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:25.225 15:18:10 -- common/autotest_common.sh@10 -- # set +x 00:02:25.225 15:18:10 -- spdk/autotest.sh@89 -- # rm -f 00:02:25.225 15:18:10 -- spdk/autotest.sh@92 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:29.416 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:02:29.416 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:02:29.416 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:02:29.416 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:02:29.416 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:02:29.416 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:02:29.416 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:02:29.416 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:02:29.416 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:02:29.416 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:02:29.416 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:02:29.416 0000:80:04.4 
(8086 2021): Already using the ioatdma driver 00:02:29.416 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:02:29.416 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:02:29.416 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:02:29.416 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:02:29.416 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:02:29.416 15:18:14 -- spdk/autotest.sh@94 -- # get_zoned_devs 00:02:29.416 15:18:14 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:02:29.416 15:18:14 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:02:29.416 15:18:14 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:02:29.416 15:18:14 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:02:29.416 15:18:14 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:02:29.416 15:18:14 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:02:29.416 15:18:14 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:29.416 15:18:14 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:02:29.416 15:18:14 -- spdk/autotest.sh@96 -- # (( 0 > 0 )) 00:02:29.416 15:18:14 -- spdk/autotest.sh@108 -- # ls /dev/nvme0n1 00:02:29.416 15:18:14 -- spdk/autotest.sh@108 -- # grep -v p 00:02:29.416 15:18:14 -- spdk/autotest.sh@108 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:02:29.416 15:18:14 -- spdk/autotest.sh@110 -- # [[ -z '' ]] 00:02:29.417 15:18:14 -- spdk/autotest.sh@111 -- # block_in_use /dev/nvme0n1 00:02:29.417 15:18:14 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:02:29.417 15:18:14 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:02:29.417 No valid GPT data, bailing 00:02:29.417 15:18:14 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:02:29.417 15:18:14 -- scripts/common.sh@393 -- # pt= 00:02:29.417 15:18:14 -- scripts/common.sh@394 -- # return 1 00:02:29.417 15:18:14 -- spdk/autotest.sh@112 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:02:29.417 1+0 records in 00:02:29.417 1+0 records out 00:02:29.417 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00661843 s, 158 MB/s 00:02:29.417 15:18:14 -- spdk/autotest.sh@116 -- # sync 00:02:29.417 15:18:14 -- spdk/autotest.sh@118 -- # xtrace_disable_per_cmd reap_spdk_processes 00:02:29.417 15:18:14 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:02:29.417 15:18:14 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:02:37.542 15:18:21 -- spdk/autotest.sh@122 -- # uname -s 00:02:37.542 15:18:21 -- spdk/autotest.sh@122 -- # '[' Linux = Linux ']' 00:02:37.542 15:18:21 -- spdk/autotest.sh@123 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:02:37.542 15:18:21 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:37.542 15:18:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:37.542 15:18:21 -- common/autotest_common.sh@10 -- # set +x 00:02:37.542 ************************************ 00:02:37.542 START TEST setup.sh 00:02:37.542 ************************************ 00:02:37.542 15:18:21 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:02:37.542 * Looking for test storage... 
00:02:37.542 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:02:37.542 15:18:21 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:02:37.542 15:18:21 -- common/autotest_common.sh@1690 -- # lcov --version 00:02:37.542 15:18:21 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:02:37.542 15:18:21 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:02:37.542 15:18:21 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:02:37.542 15:18:21 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:02:37.542 15:18:21 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:02:37.542 15:18:21 -- scripts/common.sh@335 -- # IFS=.-: 00:02:37.542 15:18:21 -- scripts/common.sh@335 -- # read -ra ver1 00:02:37.542 15:18:21 -- scripts/common.sh@336 -- # IFS=.-: 00:02:37.542 15:18:21 -- scripts/common.sh@336 -- # read -ra ver2 00:02:37.542 15:18:21 -- scripts/common.sh@337 -- # local 'op=<' 00:02:37.542 15:18:21 -- scripts/common.sh@339 -- # ver1_l=2 00:02:37.542 15:18:21 -- scripts/common.sh@340 -- # ver2_l=1 00:02:37.542 15:18:21 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:02:37.542 15:18:21 -- scripts/common.sh@343 -- # case "$op" in 00:02:37.542 15:18:21 -- scripts/common.sh@344 -- # : 1 00:02:37.542 15:18:21 -- scripts/common.sh@363 -- # (( v = 0 )) 00:02:37.542 15:18:21 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:37.542 15:18:21 -- scripts/common.sh@364 -- # decimal 1 00:02:37.542 15:18:21 -- scripts/common.sh@352 -- # local d=1 00:02:37.542 15:18:21 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:02:37.542 15:18:21 -- scripts/common.sh@354 -- # echo 1 00:02:37.542 15:18:21 -- scripts/common.sh@364 -- # ver1[v]=1 00:02:37.542 15:18:21 -- scripts/common.sh@365 -- # decimal 2 00:02:37.542 15:18:21 -- scripts/common.sh@352 -- # local d=2 00:02:37.542 15:18:21 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:02:37.542 15:18:21 -- scripts/common.sh@354 -- # echo 2 00:02:37.542 15:18:21 -- scripts/common.sh@365 -- # ver2[v]=2 00:02:37.542 15:18:21 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:02:37.542 15:18:21 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:02:37.543 15:18:21 -- scripts/common.sh@367 -- # return 0 00:02:37.543 15:18:21 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:02:37.543 15:18:21 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:02:37.543 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:37.543 --rc genhtml_branch_coverage=1 00:02:37.543 --rc genhtml_function_coverage=1 00:02:37.543 --rc genhtml_legend=1 00:02:37.543 --rc geninfo_all_blocks=1 00:02:37.543 --rc geninfo_unexecuted_blocks=1 00:02:37.543 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:37.543 ' 00:02:37.543 15:18:21 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:02:37.543 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:37.543 --rc genhtml_branch_coverage=1 00:02:37.543 --rc genhtml_function_coverage=1 00:02:37.543 --rc genhtml_legend=1 00:02:37.543 --rc geninfo_all_blocks=1 00:02:37.543 --rc geninfo_unexecuted_blocks=1 00:02:37.543 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:37.543 ' 00:02:37.543 15:18:21 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:02:37.543 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:37.543 --rc genhtml_branch_coverage=1 
00:02:37.543 --rc genhtml_function_coverage=1 00:02:37.543 --rc genhtml_legend=1 00:02:37.543 --rc geninfo_all_blocks=1 00:02:37.543 --rc geninfo_unexecuted_blocks=1 00:02:37.543 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:37.543 ' 00:02:37.543 15:18:21 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:02:37.543 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:37.543 --rc genhtml_branch_coverage=1 00:02:37.543 --rc genhtml_function_coverage=1 00:02:37.543 --rc genhtml_legend=1 00:02:37.543 --rc geninfo_all_blocks=1 00:02:37.543 --rc geninfo_unexecuted_blocks=1 00:02:37.543 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:37.543 ' 00:02:37.543 15:18:21 -- setup/test-setup.sh@10 -- # uname -s 00:02:37.543 15:18:21 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:02:37.543 15:18:21 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:02:37.543 15:18:21 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:37.543 15:18:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:37.543 15:18:21 -- common/autotest_common.sh@10 -- # set +x 00:02:37.543 ************************************ 00:02:37.543 START TEST acl 00:02:37.543 ************************************ 00:02:37.543 15:18:22 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:02:37.543 * Looking for test storage... 00:02:37.543 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:02:37.543 15:18:22 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:02:37.543 15:18:22 -- common/autotest_common.sh@1690 -- # lcov --version 00:02:37.543 15:18:22 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:02:37.543 15:18:22 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:02:37.543 15:18:22 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:02:37.543 15:18:22 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:02:37.543 15:18:22 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:02:37.543 15:18:22 -- scripts/common.sh@335 -- # IFS=.-: 00:02:37.543 15:18:22 -- scripts/common.sh@335 -- # read -ra ver1 00:02:37.543 15:18:22 -- scripts/common.sh@336 -- # IFS=.-: 00:02:37.543 15:18:22 -- scripts/common.sh@336 -- # read -ra ver2 00:02:37.543 15:18:22 -- scripts/common.sh@337 -- # local 'op=<' 00:02:37.543 15:18:22 -- scripts/common.sh@339 -- # ver1_l=2 00:02:37.543 15:18:22 -- scripts/common.sh@340 -- # ver2_l=1 00:02:37.543 15:18:22 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:02:37.543 15:18:22 -- scripts/common.sh@343 -- # case "$op" in 00:02:37.543 15:18:22 -- scripts/common.sh@344 -- # : 1 00:02:37.543 15:18:22 -- scripts/common.sh@363 -- # (( v = 0 )) 00:02:37.543 15:18:22 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:37.543 15:18:22 -- scripts/common.sh@364 -- # decimal 1 00:02:37.543 15:18:22 -- scripts/common.sh@352 -- # local d=1 00:02:37.543 15:18:22 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:02:37.543 15:18:22 -- scripts/common.sh@354 -- # echo 1 00:02:37.543 15:18:22 -- scripts/common.sh@364 -- # ver1[v]=1 00:02:37.543 15:18:22 -- scripts/common.sh@365 -- # decimal 2 00:02:37.543 15:18:22 -- scripts/common.sh@352 -- # local d=2 00:02:37.543 15:18:22 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:02:37.543 15:18:22 -- scripts/common.sh@354 -- # echo 2 00:02:37.543 15:18:22 -- scripts/common.sh@365 -- # ver2[v]=2 00:02:37.543 15:18:22 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:02:37.543 15:18:22 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:02:37.543 15:18:22 -- scripts/common.sh@367 -- # return 0 00:02:37.543 15:18:22 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:02:37.543 15:18:22 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:02:37.543 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:37.543 --rc genhtml_branch_coverage=1 00:02:37.543 --rc genhtml_function_coverage=1 00:02:37.543 --rc genhtml_legend=1 00:02:37.543 --rc geninfo_all_blocks=1 00:02:37.543 --rc geninfo_unexecuted_blocks=1 00:02:37.543 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:37.543 ' 00:02:37.543 15:18:22 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:02:37.543 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:37.543 --rc genhtml_branch_coverage=1 00:02:37.543 --rc genhtml_function_coverage=1 00:02:37.543 --rc genhtml_legend=1 00:02:37.543 --rc geninfo_all_blocks=1 00:02:37.543 --rc geninfo_unexecuted_blocks=1 00:02:37.543 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:37.543 ' 00:02:37.543 15:18:22 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:02:37.543 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:37.543 --rc genhtml_branch_coverage=1 00:02:37.543 --rc genhtml_function_coverage=1 00:02:37.543 --rc genhtml_legend=1 00:02:37.543 --rc geninfo_all_blocks=1 00:02:37.543 --rc geninfo_unexecuted_blocks=1 00:02:37.543 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:37.543 ' 00:02:37.543 15:18:22 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:02:37.543 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:37.543 --rc genhtml_branch_coverage=1 00:02:37.543 --rc genhtml_function_coverage=1 00:02:37.543 --rc genhtml_legend=1 00:02:37.543 --rc geninfo_all_blocks=1 00:02:37.543 --rc geninfo_unexecuted_blocks=1 00:02:37.543 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:37.543 ' 00:02:37.543 15:18:22 -- setup/acl.sh@10 -- # get_zoned_devs 00:02:37.543 15:18:22 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:02:37.543 15:18:22 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:02:37.543 15:18:22 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:02:37.543 15:18:22 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:02:37.543 15:18:22 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:02:37.543 15:18:22 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:02:37.543 15:18:22 -- common/autotest_common.sh@1659 -- # [[ -e 
/sys/block/nvme0n1/queue/zoned ]] 00:02:37.543 15:18:22 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:02:37.543 15:18:22 -- setup/acl.sh@12 -- # devs=() 00:02:37.543 15:18:22 -- setup/acl.sh@12 -- # declare -a devs 00:02:37.543 15:18:22 -- setup/acl.sh@13 -- # drivers=() 00:02:37.543 15:18:22 -- setup/acl.sh@13 -- # declare -A drivers 00:02:37.543 15:18:22 -- setup/acl.sh@51 -- # setup reset 00:02:37.543 15:18:22 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:37.543 15:18:22 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:40.085 15:18:25 -- setup/acl.sh@52 -- # collect_setup_devs 00:02:40.085 15:18:25 -- setup/acl.sh@16 -- # local dev driver 00:02:40.085 15:18:25 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.085 15:18:25 -- setup/acl.sh@15 -- # setup output status 00:02:40.085 15:18:25 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:40.085 15:18:25 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:02:42.623 Hugepages 00:02:42.623 node hugesize free / total 00:02:42.623 15:18:28 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:02:42.623 15:18:28 -- setup/acl.sh@19 -- # continue 00:02:42.623 15:18:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:42.884 15:18:28 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:02:42.884 15:18:28 -- setup/acl.sh@19 -- # continue 00:02:42.884 15:18:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:42.884 15:18:28 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:02:42.884 15:18:28 -- setup/acl.sh@19 -- # continue 00:02:42.884 15:18:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:42.884 00:02:42.884 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:42.884 15:18:28 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:02:42.884 15:18:28 -- setup/acl.sh@19 -- # continue 00:02:42.884 15:18:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:42.884 15:18:28 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:02:42.884 15:18:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:42.884 15:18:28 -- setup/acl.sh@20 -- # continue 00:02:42.884 15:18:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:42.884 15:18:28 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:02:42.884 15:18:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:42.884 15:18:28 -- setup/acl.sh@20 -- # continue 00:02:42.884 15:18:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:42.884 15:18:28 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:02:42.884 15:18:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:42.884 15:18:28 -- setup/acl.sh@20 -- # continue 00:02:42.884 15:18:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:42.884 15:18:28 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:02:42.884 15:18:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:42.884 15:18:28 -- setup/acl.sh@20 -- # continue 00:02:42.884 15:18:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:42.884 15:18:28 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:02:42.884 15:18:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:42.884 15:18:28 -- setup/acl.sh@20 -- # continue 00:02:42.884 15:18:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:42.884 15:18:28 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:02:42.884 15:18:28 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:42.884 
15:18:28 -- setup/acl.sh@20 -- # continue 00:02:42.884 15:18:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ [the same BDF pattern-match / [[ ioatdma == nvme ]] / continue / read cycle then repeats for 0000:00:04.6, 0000:00:04.7 and 0000:80:04.0 through 0000:80:04.7, all bound to ioatdma] 00:02:42.884 15:18:28 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:02:42.884 15:18:28 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:02:42.884 15:18:28 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:02:42.884 15:18:28 -- setup/acl.sh@22 -- # devs+=("$dev") 00:02:42.884 15:18:28 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:02:42.884 15:18:28 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:42.884 15:18:28 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:02:42.884 15:18:28 -- setup/acl.sh@54 -- # run_test denied denied 00:02:42.884 15:18:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:42.884 15:18:28 -- common/autotest_common.sh@1093 -- # 
xtrace_disable 00:02:42.884 15:18:28 -- common/autotest_common.sh@10 -- # set +x 00:02:42.884 ************************************ 00:02:42.884 START TEST denied 00:02:42.884 ************************************ 00:02:42.884 15:18:28 -- common/autotest_common.sh@1114 -- # denied 00:02:42.884 15:18:28 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:02:42.884 15:18:28 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:02:42.884 15:18:28 -- setup/acl.sh@38 -- # setup output config 00:02:42.884 15:18:28 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:42.884 15:18:28 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:02:47.082 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:02:47.082 15:18:32 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:02:47.082 15:18:32 -- setup/acl.sh@28 -- # local dev driver 00:02:47.082 15:18:32 -- setup/acl.sh@30 -- # for dev in "$@" 00:02:47.082 15:18:32 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:02:47.082 15:18:32 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:02:47.082 15:18:32 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:02:47.082 15:18:32 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:02:47.082 15:18:32 -- setup/acl.sh@41 -- # setup reset 00:02:47.082 15:18:32 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:47.082 15:18:32 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:51.274 00:02:51.274 real 0m8.110s 00:02:51.274 user 0m2.580s 00:02:51.274 sys 0m4.903s 00:02:51.274 15:18:36 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:02:51.274 15:18:36 -- common/autotest_common.sh@10 -- # set +x 00:02:51.274 ************************************ 00:02:51.274 END TEST denied 00:02:51.274 ************************************ 00:02:51.274 15:18:36 -- setup/acl.sh@55 -- # run_test allowed allowed 00:02:51.274 15:18:36 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:51.274 15:18:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:51.274 15:18:36 -- common/autotest_common.sh@10 -- # set +x 00:02:51.274 ************************************ 00:02:51.274 START TEST allowed 00:02:51.274 ************************************ 00:02:51.274 15:18:36 -- common/autotest_common.sh@1114 -- # allowed 00:02:51.274 15:18:36 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:02:51.274 15:18:36 -- setup/acl.sh@45 -- # setup output config 00:02:51.274 15:18:36 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:02:51.274 15:18:36 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:51.274 15:18:36 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:02:56.546 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:02:56.546 15:18:41 -- setup/acl.sh@47 -- # verify 00:02:56.546 15:18:41 -- setup/acl.sh@28 -- # local dev driver 00:02:56.546 15:18:41 -- setup/acl.sh@48 -- # setup reset 00:02:56.546 15:18:41 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:56.546 15:18:41 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:59.830 00:02:59.830 real 0m8.578s 00:02:59.830 user 0m2.477s 00:02:59.830 sys 0m4.733s 00:02:59.830 15:18:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:02:59.830 15:18:45 -- common/autotest_common.sh@10 -- # set +x 00:02:59.830 ************************************ 
00:02:59.830 END TEST allowed 00:02:59.830 ************************************ 00:02:59.830 00:02:59.830 real 0m23.389s 00:02:59.830 user 0m7.365s 00:02:59.830 sys 0m14.220s 00:02:59.830 15:18:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:02:59.830 15:18:45 -- common/autotest_common.sh@10 -- # set +x 00:02:59.830 ************************************ 00:02:59.830 END TEST acl 00:02:59.830 ************************************ 00:02:59.830 15:18:45 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:02:59.830 15:18:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:59.830 15:18:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:59.830 15:18:45 -- common/autotest_common.sh@10 -- # set +x 00:02:59.830 ************************************ 00:02:59.830 START TEST hugepages 00:02:59.830 ************************************ 00:02:59.830 15:18:45 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:02:59.830 * Looking for test storage... 00:02:59.830 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:02:59.830 15:18:45 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:02:59.830 15:18:45 -- common/autotest_common.sh@1690 -- # lcov --version 00:02:59.830 15:18:45 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:02:59.830 15:18:45 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:02:59.830 15:18:45 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:02:59.830 15:18:45 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:02:59.830 15:18:45 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:02:59.830 15:18:45 -- scripts/common.sh@335 -- # IFS=.-: 00:02:59.830 15:18:45 -- scripts/common.sh@335 -- # read -ra ver1 00:02:59.830 15:18:45 -- scripts/common.sh@336 -- # IFS=.-: 00:02:59.830 15:18:45 -- scripts/common.sh@336 -- # read -ra ver2 00:02:59.830 15:18:45 -- scripts/common.sh@337 -- # local 'op=<' 00:02:59.830 15:18:45 -- scripts/common.sh@339 -- # ver1_l=2 00:02:59.830 15:18:45 -- scripts/common.sh@340 -- # ver2_l=1 00:02:59.830 15:18:45 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:02:59.830 15:18:45 -- scripts/common.sh@343 -- # case "$op" in 00:02:59.830 15:18:45 -- scripts/common.sh@344 -- # : 1 00:02:59.830 15:18:45 -- scripts/common.sh@363 -- # (( v = 0 )) 00:02:59.830 15:18:45 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:59.830 15:18:45 -- scripts/common.sh@364 -- # decimal 1 00:02:59.830 15:18:45 -- scripts/common.sh@352 -- # local d=1 00:02:59.830 15:18:45 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:02:59.830 15:18:45 -- scripts/common.sh@354 -- # echo 1 00:02:59.830 15:18:45 -- scripts/common.sh@364 -- # ver1[v]=1 00:02:59.830 15:18:45 -- scripts/common.sh@365 -- # decimal 2 00:02:59.830 15:18:45 -- scripts/common.sh@352 -- # local d=2 00:02:59.830 15:18:45 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:02:59.830 15:18:45 -- scripts/common.sh@354 -- # echo 2 00:02:59.830 15:18:45 -- scripts/common.sh@365 -- # ver2[v]=2 00:02:59.830 15:18:45 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:02:59.830 15:18:45 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:02:59.830 15:18:45 -- scripts/common.sh@367 -- # return 0 00:02:59.830 15:18:45 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:02:59.830 15:18:45 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:02:59.830 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:59.830 --rc genhtml_branch_coverage=1 00:02:59.830 --rc genhtml_function_coverage=1 00:02:59.830 --rc genhtml_legend=1 00:02:59.830 --rc geninfo_all_blocks=1 00:02:59.830 --rc geninfo_unexecuted_blocks=1 00:02:59.830 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:59.830 ' 00:02:59.830 15:18:45 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:02:59.830 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:59.830 --rc genhtml_branch_coverage=1 00:02:59.830 --rc genhtml_function_coverage=1 00:02:59.830 --rc genhtml_legend=1 00:02:59.830 --rc geninfo_all_blocks=1 00:02:59.830 --rc geninfo_unexecuted_blocks=1 00:02:59.830 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:59.830 ' 00:02:59.830 15:18:45 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:02:59.830 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:59.830 --rc genhtml_branch_coverage=1 00:02:59.830 --rc genhtml_function_coverage=1 00:02:59.830 --rc genhtml_legend=1 00:02:59.830 --rc geninfo_all_blocks=1 00:02:59.830 --rc geninfo_unexecuted_blocks=1 00:02:59.830 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:59.830 ' 00:02:59.830 15:18:45 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:02:59.830 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:59.830 --rc genhtml_branch_coverage=1 00:02:59.830 --rc genhtml_function_coverage=1 00:02:59.830 --rc genhtml_legend=1 00:02:59.830 --rc geninfo_all_blocks=1 00:02:59.830 --rc geninfo_unexecuted_blocks=1 00:02:59.830 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:59.830 ' 00:02:59.830 15:18:45 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:02:59.830 15:18:45 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:02:59.830 15:18:45 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:02:59.830 15:18:45 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:02:59.830 15:18:45 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:02:59.830 15:18:45 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:02:59.830 15:18:45 -- setup/common.sh@17 -- # local get=Hugepagesize 00:02:59.830 15:18:45 -- setup/common.sh@18 -- # local node= 00:02:59.830 15:18:45 -- setup/common.sh@19 -- # local var val 
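The lt/cmp_versions xtrace above repeats near-verbatim each time a test script sources the common helpers: it splits two dotted versions on IFS=.-: into arrays and compares them component by component to decide which set of lcov options to export. A minimal standalone sketch of that comparison, assuming plain numeric dot-separated versions; version_lt is an illustrative name, not the actual scripts/common.sh helper:

#!/usr/bin/env bash

# Compare two dot-separated numeric versions; succeed (return 0) when $1 < $2.
# version_lt is an illustrative stand-in for the lt/cmp_versions logic in the trace.
version_lt() {
  local -a ver1 ver2
  local i len
  IFS=.-: read -ra ver1 <<< "$1"   # the trace splits on IFS=.-: as well
  IFS=.-: read -ra ver2 <<< "$2"
  len=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
  for (( i = 0; i < len; i++ )); do
    # Missing components count as 0, so 1.15 compares as 1.15.0 against 2.0.0.
    (( ${ver1[i]:-0} < ${ver2[i]:-0} )) && return 0
    (( ${ver1[i]:-0} > ${ver2[i]:-0} )) && return 1
  done
  return 1   # equal versions are not "less than"
}

# Usage mirroring the trace's decision point (lcov 1.15 vs 2):
if version_lt 1.15 2; then
  echo "lcov 1.15 predates 2.x, keep the old-style --rc lcov_* options"
fi

The first differing component decides the comparison, which is why the traced run with LCOV version 1.15 takes the "less than 2" branch and exports the lcov_branch_coverage/lcov_function_coverage flags seen above.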
00:02:59.830 15:18:45 -- setup/common.sh@20 -- # local mem_f mem 00:02:59.830 15:18:45 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:02:59.830 15:18:45 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:02:59.830 15:18:45 -- setup/common.sh@25 -- # [[ -n '' ]] 00:02:59.830 15:18:45 -- setup/common.sh@28 -- # mapfile -t mem 00:02:59.830 15:18:45 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:02:59.830 15:18:45 -- setup/common.sh@31 -- # IFS=': ' 00:02:59.830 15:18:45 -- setup/common.sh@31 -- # read -r var val _ 00:02:59.830 15:18:45 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 40758156 kB' 'MemAvailable: 44479304 kB' 'Buffers: 8940 kB' 'Cached: 11211176 kB' 'SwapCached: 0 kB' 'Active: 7999832 kB' 'Inactive: 3688224 kB' 'Active(anon): 7582452 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 471324 kB' 'Mapped: 173088 kB' 'Shmem: 7114512 kB' 'KReclaimable: 225272 kB' 'Slab: 923312 kB' 'SReclaimable: 225272 kB' 'SUnreclaim: 698040 kB' 'KernelStack: 21856 kB' 'PageTables: 8000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433332 kB' 'Committed_AS: 8795876 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214064 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB' [xtrace then repeats the IFS=': ' / read -r var val _ / [[ $var == Hugepagesize ]] / continue walk for every key from MemTotal through HugePages_Surp before the match] 00:03:00.089 15:18:45 -- setup/common.sh@31 -- # IFS=': ' 00:03:00.089 15:18:45 -- setup/common.sh@31 -- # read -r var val _ 00:03:00.089 15:18:45 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:00.089 15:18:45 -- setup/common.sh@33 -- # echo 2048 00:03:00.089 15:18:45 -- setup/common.sh@33 -- # return 0 00:03:00.089 15:18:45 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:00.090 15:18:45 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:00.090 15:18:45 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:00.090 15:18:45 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:00.090 15:18:45 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:00.090 15:18:45 -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:00.090 15:18:45 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:00.090 15:18:45 -- setup/hugepages.sh@207 -- # get_nodes 00:03:00.090 15:18:45 -- setup/hugepages.sh@27 -- # local node 00:03:00.090 15:18:45 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:00.090 15:18:45 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:03:00.090 15:18:45 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:00.090 15:18:45 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:00.090 15:18:45 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:00.090 15:18:45 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:00.090 15:18:45 -- 
setup/hugepages.sh@208 -- # clear_hp 00:03:00.090 15:18:45 -- setup/hugepages.sh@37 -- # local node hp 00:03:00.090 15:18:45 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:00.090 15:18:45 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:00.090 15:18:45 -- setup/hugepages.sh@41 -- # echo 0 00:03:00.090 15:18:45 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:00.090 15:18:45 -- setup/hugepages.sh@41 -- # echo 0 00:03:00.090 15:18:45 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:00.090 15:18:45 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:00.090 15:18:45 -- setup/hugepages.sh@41 -- # echo 0 00:03:00.090 15:18:45 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:00.090 15:18:45 -- setup/hugepages.sh@41 -- # echo 0 00:03:00.090 15:18:45 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:00.090 15:18:45 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:00.090 15:18:45 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:00.090 15:18:45 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:00.090 15:18:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:00.090 15:18:45 -- common/autotest_common.sh@10 -- # set +x 00:03:00.090 ************************************ 00:03:00.090 START TEST default_setup 00:03:00.090 ************************************ 00:03:00.090 15:18:45 -- common/autotest_common.sh@1114 -- # default_setup 00:03:00.090 15:18:45 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:00.090 15:18:45 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:00.090 15:18:45 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:00.090 15:18:45 -- setup/hugepages.sh@51 -- # shift 00:03:00.090 15:18:45 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:00.090 15:18:45 -- setup/hugepages.sh@52 -- # local node_ids 00:03:00.090 15:18:45 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:00.090 15:18:45 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:00.090 15:18:45 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:00.090 15:18:45 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:00.090 15:18:45 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:00.090 15:18:45 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:00.090 15:18:45 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:00.090 15:18:45 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:00.090 15:18:45 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:00.090 15:18:45 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:00.090 15:18:45 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:00.090 15:18:45 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:00.090 15:18:45 -- setup/hugepages.sh@73 -- # return 0 00:03:00.090 15:18:45 -- setup/hugepages.sh@137 -- # setup output 00:03:00.090 15:18:45 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:00.090 15:18:45 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:03.376 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:03.376 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:03.376 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:03.376 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:03.376 0000:00:04.3 (8086 2021): 
ioatdma -> vfio-pci 00:03:03.376 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:03.376 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:03.376 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:03.376 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:03.636 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:03.636 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:03.636 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:03.636 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:03.636 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:03.636 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:03.636 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:05.549 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:05.549 15:18:50 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:05.549 15:18:50 -- setup/hugepages.sh@89 -- # local node 00:03:05.549 15:18:50 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:05.549 15:18:50 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:05.549 15:18:50 -- setup/hugepages.sh@92 -- # local surp 00:03:05.549 15:18:50 -- setup/hugepages.sh@93 -- # local resv 00:03:05.549 15:18:50 -- setup/hugepages.sh@94 -- # local anon 00:03:05.549 15:18:50 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:05.549 15:18:50 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:05.549 15:18:50 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:05.549 15:18:50 -- setup/common.sh@18 -- # local node= 00:03:05.549 15:18:50 -- setup/common.sh@19 -- # local var val 00:03:05.549 15:18:50 -- setup/common.sh@20 -- # local mem_f mem 00:03:05.549 15:18:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:05.549 15:18:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:05.549 15:18:50 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:05.549 15:18:50 -- setup/common.sh@28 -- # mapfile -t mem 00:03:05.549 15:18:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:05.549 15:18:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:05.549 15:18:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:05.549 15:18:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 42924508 kB' 'MemAvailable: 46645304 kB' 'Buffers: 8940 kB' 'Cached: 11211324 kB' 'SwapCached: 0 kB' 'Active: 8002912 kB' 'Inactive: 3688224 kB' 'Active(anon): 7585532 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 474456 kB' 'Mapped: 173152 kB' 'Shmem: 7114660 kB' 'KReclaimable: 224568 kB' 'Slab: 921152 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696584 kB' 'KernelStack: 21968 kB' 'PageTables: 8352 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8798368 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB' [xtrace again walks /proc/meminfo key by key with the IFS=': ' / read -r var val _ / [[ $var == AnonHugePages ]] / continue triple; the log breaks off mid-walk after the Percpu comparison] 00:03:05.550 15:18:50 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:05.550 15:18:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:05.550 15:18:50 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:05.550 15:18:50 -- setup/common.sh@32 -- # continue 00:03:05.550 15:18:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:05.550 15:18:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:05.550 15:18:50 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:05.550 15:18:50 -- setup/common.sh@33 -- # echo 0 00:03:05.550 15:18:50 -- setup/common.sh@33 -- # return 0 00:03:05.550 15:18:50 -- setup/hugepages.sh@97 -- # anon=0 00:03:05.550 15:18:50 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:05.550 15:18:50 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:05.550 15:18:50 -- setup/common.sh@18 -- # local node= 00:03:05.550 15:18:50 -- setup/common.sh@19 -- # local var val 00:03:05.550 15:18:50 -- setup/common.sh@20 -- # local mem_f mem 00:03:05.550 15:18:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:05.550 15:18:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:05.550 15:18:50 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:05.550 15:18:50 -- setup/common.sh@28 -- # mapfile -t mem 00:03:05.550 15:18:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:05.550 15:18:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:05.551 15:18:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:05.551 15:18:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 42933956 kB' 'MemAvailable: 46654752 kB' 'Buffers: 8940 kB' 'Cached: 11211324 kB' 'SwapCached: 0 kB' 'Active: 8002656 kB' 'Inactive: 3688224 kB' 'Active(anon): 7585276 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 474252 kB' 'Mapped: 173140 kB' 'Shmem: 7114660 kB' 'KReclaimable: 224568 kB' 'Slab: 921104 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696536 kB' 'KernelStack: 21952 kB' 'PageTables: 8104 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8799892 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB' 00:03:05.551 15:18:50 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:05.551 15:18:50 -- setup/common.sh@32 -- # continue 00:03:05.551 15:18:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:05.551 15:18:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:05.551 15:18:50 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:05.551 15:18:50 -- setup/common.sh@32 -- # continue 00:03:05.551 15:18:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:05.551 15:18:50 -- setup/common.sh@31 -- # read -r var val _ 00:03:05.551 15:18:50 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:05.551 15:18:50 -- setup/common.sh@32 -- # continue 00:03:05.551 
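The loop traced above is setup/common.sh's get_meminfo: it loads the chosen meminfo file into an array, strips any per-node "Node N " prefix, then walks it field by field and echoes the value once the requested key matches (hence the long runs of [[ ... ]] / continue). A minimal stand-alone sketch of the same lookup; the function and variable names are illustrative, and a sed prefix strip stands in for the script's mapfile/extglob expansion:

get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo var val _
    # Per-node statistics come from sysfs instead when a node is given.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    # Per-node files prefix each line with "Node N "; strip it so both
    # sources split identically on ': '.
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "${val:-0}"
            return 0
        fi
    done < <(sed 's/^Node [0-9]* //' "$mem_f")
    return 1
}

anon=$(get_meminfo_sketch AnonHugePages)      # -> 0 on this box
surp=$(get_meminfo_sketch HugePages_Surp 0)   # node 0, read via sysfs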
[... skip/continue scan continues field by field over the dump above ...]
00:03:05.552 15:18:50 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:05.552 15:18:50 -- setup/common.sh@33 -- # echo 0 00:03:05.552 15:18:50 -- setup/common.sh@33 -- # return 0
00:03:05.552 15:18:50 -- setup/hugepages.sh@99 -- # surp=0
00:03:05.552 15:18:50 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:05.552 15:18:50 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:05.552 15:18:50 -- setup/common.sh@18 -- # local node= 00:03:05.552 15:18:50 -- setup/common.sh@19 -- # local var val 00:03:05.552 15:18:50 -- setup/common.sh@20 -- # local mem_f mem 00:03:05.552 15:18:50 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:05.552 15:18:50 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:05.552 15:18:50 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:05.552 15:18:50 -- setup/common.sh@28 -- # mapfile -t mem 00:03:05.552 15:18:50 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:05.552 15:18:50 -- setup/common.sh@31 -- # IFS=': ' 00:03:05.552 15:18:50 -- setup/common.sh@31 -- # read -r var val _
00:03:05.552 15:18:50 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 42934640 kB' 'MemAvailable: 46655436 kB' 'Buffers: 8940 kB' 'Cached: 11211336 kB' 'SwapCached: 0 kB' 'Active: 8001884 kB' 'Inactive: 3688224 kB' 'Active(anon): 7584504 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 473404 kB' 'Mapped: 173140 kB' 'Shmem: 7114672 kB' 'KReclaimable: 224568 kB' 'Slab: 921132 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696564 kB' 'KernelStack: 21968 kB' 'PageTables: 8096 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8799904 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
[... field-by-field skip/continue scan over this dump begins, looking for HugePages_Rsvd ...]
[... skip/continue scan continues field by field over the dump above ...]
00:03:05.553 15:18:51 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:05.553 15:18:51 -- setup/common.sh@33 -- # echo 0 00:03:05.553 15:18:51 -- setup/common.sh@33 -- # return 0
00:03:05.553 15:18:51 -- setup/hugepages.sh@100 -- # resv=0
00:03:05.553 15:18:51 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:05.553 nr_hugepages=1024 00:03:05.553 15:18:51 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:05.553 resv_hugepages=0 00:03:05.553 15:18:51 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:05.553 surplus_hugepages=0 00:03:05.553 15:18:51 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:05.554 anon_hugepages=0
00:03:05.554 15:18:51 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:05.554 15:18:51 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:05.554 15:18:51 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:05.554 15:18:51 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:05.554 15:18:51 -- setup/common.sh@18 -- # local node= 00:03:05.554 15:18:51 -- setup/common.sh@19 -- # local var val 00:03:05.554 15:18:51 -- setup/common.sh@20 -- # local mem_f mem 00:03:05.554 15:18:51 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:05.554 15:18:51 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:05.554 15:18:51 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:05.554 15:18:51 -- setup/common.sh@28 -- # mapfile -t mem 00:03:05.554 15:18:51 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:05.554 15:18:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:05.554 15:18:51 -- setup/common.sh@31 -- # read -r var val _
00:03:05.554 15:18:51 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 42936980 kB' 'MemAvailable: 46657776 kB' 'Buffers: 8940 kB' 'Cached: 11211352 kB' 'SwapCached: 0 kB' 'Active: 8002388 kB' 'Inactive: 3688224 kB' 'Active(anon): 7585008 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 474020 kB' 'Mapped: 173140 kB' 'Shmem: 7114688 kB' 'KReclaimable: 224568 kB' 'Slab: 921132 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696564 kB' 'KernelStack: 21904 kB' 'PageTables: 7740 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8798408 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
[... field-by-field skip/continue scan over this dump begins, looking for HugePages_Total ...]
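At setup/hugepages.sh@102-@109 the script prints the derived counters and asserts that the kernel's HugePages_Total equals the requested nr_hugepages plus surplus and reserved pages. A sketch of that invariant using the values from this run; get_meminfo_sketch is the illustrative helper above, not SPDK's exact code:

nr_hugepages=1024
surp=$(get_meminfo_sketch HugePages_Surp)    # 0 in the dump above
resv=$(get_meminfo_sketch HugePages_Rsvd)    # 0 in the dump above
total=$(get_meminfo_sketch HugePages_Total)  # 1024 in the dump above
if (( total == nr_hugepages + surp + resv )); then
    echo "hugepage accounting consistent: total=$total"
else
    echo "hugepage accounting mismatch: total=$total, expected $((nr_hugepages + surp + resv))" >&2
    exit 1
fi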
[... skip/continue scan continues field by field over the dump above ...]
00:03:05.555 15:18:51 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:05.555 15:18:51 -- setup/common.sh@33 -- # echo 1024 00:03:05.555 15:18:51 -- setup/common.sh@33 -- # return 0
00:03:05.555 15:18:51 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:05.555 15:18:51 -- setup/hugepages.sh@112 -- # get_nodes 00:03:05.555 15:18:51 -- setup/hugepages.sh@27 -- # local node 00:03:05.555 15:18:51 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:05.555 15:18:51 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:05.555 15:18:51 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:05.555 15:18:51 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:05.555 15:18:51 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:05.555 15:18:51 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:05.555 15:18:51 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:05.555 15:18:51 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:05.555 15:18:51 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:05.555 15:18:51 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:05.555 15:18:51 -- setup/common.sh@18 -- # local node=0 00:03:05.555 15:18:51 -- setup/common.sh@19 -- # local var val 00:03:05.555 15:18:51 -- setup/common.sh@20 -- # local mem_f mem 00:03:05.555 15:18:51 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:05.555 15:18:51 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:05.555 15:18:51 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:05.555 15:18:51 -- setup/common.sh@28 -- # mapfile -t mem 00:03:05.555 15:18:51 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:05.555 15:18:51 -- setup/common.sh@31 -- # IFS=': ' 00:03:05.555 15:18:51 -- setup/common.sh@31 -- # read -r var val _
00:03:05.555 15:18:51 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 26327324 kB' 'MemUsed: 6307112 kB' 'SwapCached: 0 kB' 'Active: 3020624 kB' 'Inactive: 134724 kB' 'Active(anon): 2684244 kB' 'Inactive(anon): 0 kB' 'Active(file): 336380 kB' 'Inactive(file): 134724 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2999268 kB' 'Mapped: 145648 kB' 'AnonPages: 159584 kB' 'Shmem: 2528164 kB' 'KernelStack: 12344 kB' 'PageTables: 4344 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122404 kB' 'Slab: 457924 kB' 'SReclaimable: 122404 kB' 'SUnreclaim: 335520 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[... field-by-field skip/continue scan over the node0 dump begins, looking for HugePages_Surp ...]
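setup/hugepages.sh@112-@117 (get_nodes and the loop that follows) then breaks the pool down per NUMA node: on this two-node box all 1024 pages sit on node 0 and none on node 1. A sketch of that per-node walk; the sysfs counter path is a standard Linux location for 2 MiB pages, but whether get_nodes reads exactly this file is an assumption here, and nodes_sys mirrors the trace's array name:

declare -A nodes_sys
for node in /sys/devices/system/node/node[0-9]*; do
    n=${node##*node}
    # Assumed source of the per-node count (2048 kB pages, as in the dump).
    nodes_sys[$n]=$(cat "$node/hugepages/hugepages-2048kB/nr_hugepages")
done
for n in "${!nodes_sys[@]}"; do
    echo "node$n: nr_hugepages=${nodes_sys[$n]} surp=$(get_meminfo_sketch HugePages_Surp "$n")"
done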
00:03:05.556 15:18:51 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:05.556 15:18:51 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:05.556 15:18:51 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:05.556 15:18:51 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:05.556 15:18:51 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:05.556 node0=1024 expecting 1024
00:03:05.556 15:18:51 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:05.556
00:03:05.556 real	0m5.351s
00:03:05.556 user	0m1.431s
00:03:05.556 sys	0m2.419s
00:03:05.556 15:18:51 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:05.556 15:18:51 -- common/autotest_common.sh@10 -- # set +x
00:03:05.556 ************************************
00:03:05.556 END TEST default_setup
00:03:05.556 ************************************
00:03:05.556 15:18:51 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:03:05.556 15:18:51 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:05.556 15:18:51 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:05.556 15:18:51 -- common/autotest_common.sh@10 -- # set +x
00:03:05.556 ************************************
00:03:05.556 START TEST per_node_1G_alloc
00:03:05.556 ************************************
00:03:05.556 15:18:51 -- common/autotest_common.sh@1114 -- # per_node_1G_alloc
00:03:05.556 15:18:51 -- setup/hugepages.sh@143 -- # local IFS=,
00:03:05.556 15:18:51 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:03:05.556 15:18:51 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:05.556 15:18:51 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:03:05.556 15:18:51 -- setup/hugepages.sh@51 -- # shift
00:03:05.556 15:18:51 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:03:05.556 15:18:51 -- setup/hugepages.sh@52 -- # local node_ids
00:03:05.556 15:18:51 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:05.556 15:18:51 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:05.556 15:18:51 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:03:05.556 15:18:51 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:03:05.556 15:18:51 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:05.556 15:18:51 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:05.556 15:18:51 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:05.556 15:18:51 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:05.556 15:18:51 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:05.556 15:18:51 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:03:05.556 15:18:51 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:05.556 15:18:51 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:05.556 15:18:51 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:05.556 15:18:51 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:05.556 15:18:51 -- setup/hugepages.sh@73 -- # return 0
00:03:05.556 15:18:51 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:03:05.556 15:18:51 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:03:05.556 15:18:51 -- setup/hugepages.sh@146 -- # setup output
00:03:05.556 15:18:51 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:05.556 15:18:51 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
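The counts in that trace follow from simple division, reading the size argument in kB (1048576 kB = 1 GiB per node), which is consistent with the "Hugepagesize: 2048 kB" reported in the meminfo dumps and the nr_hugepages=512 result. A sketch of the arithmetic (variable names are illustrative, not the script's own):

    #!/usr/bin/env bash
    # Reconstruction of the per-node hugepage math traced above.
    size_kb=1048576        # get_test_nr_hugepages 1048576 0 1 -> 1 GiB requested per node
    hugepagesize_kb=2048   # "Hugepagesize: 2048 kB" from the meminfo dumps
    node_ids=(0 1)         # HUGENODE=0,1

    nr_hugepages=$(( size_kb / hugepagesize_kb ))   # 1048576 / 2048 = 512
    declare -A nodes_test
    for n in "${node_ids[@]}"; do
        nodes_test[$n]=$nr_hugepages                # nodes_test[0]=512, nodes_test[1]=512
    done
    echo "NRHUGE=$nr_hugepages HUGENODE=${node_ids[*]}"

With both NUMA nodes holding 512 pages each, the system total of 1024 hugepages matches the HugePages_Total seen in every dump that follows.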
00:03:08.923 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:08.923 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:08.923 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:08.923 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:08.923 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:08.923 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:08.923 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:08.923 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:08.923 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:08.923 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:08.923 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:08.923 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:08.923 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:08.923 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:08.923 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:08.923 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:08.924 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:08.924 15:18:54 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:03:08.924 15:18:54 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:03:08.924 15:18:54 -- setup/hugepages.sh@89 -- # local node
00:03:08.924 15:18:54 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:08.924 15:18:54 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:08.924 15:18:54 -- setup/hugepages.sh@92 -- # local surp
00:03:08.924 15:18:54 -- setup/hugepages.sh@93 -- # local resv
00:03:08.924 15:18:54 -- setup/hugepages.sh@94 -- # local anon
00:03:08.924 15:18:54 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:08.924 15:18:54 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:08.924 15:18:54 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:08.924 15:18:54 -- setup/common.sh@18 -- # local node=
00:03:08.924 15:18:54 -- setup/common.sh@19 -- # local var val
00:03:08.924 15:18:54 -- setup/common.sh@20 -- # local mem_f mem
00:03:08.924 15:18:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:08.924 15:18:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:08.924 15:18:54 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:08.924 15:18:54 -- setup/common.sh@28 -- # mapfile -t mem
00:03:08.924 15:18:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:08.924 15:18:54 -- setup/common.sh@31 -- # IFS=': '
00:03:08.924 15:18:54 -- setup/common.sh@31 -- # read -r var val _
00:03:08.924 15:18:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 42972036 kB' 'MemAvailable: 46692832 kB' 'Buffers: 8940 kB' 'Cached: 11211440 kB' 'SwapCached: 0 kB' 'Active: 8002116 kB' 'Inactive: 3688224 kB' 'Active(anon): 7584736 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 473304 kB' 'Mapped: 173232 kB' 'Shmem: 7114776 kB' 'KReclaimable: 224568 kB' 'Slab: 920828 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696260 kB' 'KernelStack: 21824 kB' 'PageTables: 7812 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8796216 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
[xtrace elided: field-by-field scan of /proc/meminfo (MemTotal through HardwareCorrupted), none matching AnonHugePages]
00:03:08.925 15:18:54 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:08.925 15:18:54 -- setup/common.sh@33 -- # echo 0
00:03:08.925 15:18:54 -- setup/common.sh@33 -- # return 0
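The [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] test at hugepages.sh@96 reflects how the kernel reports transparent-hugepage policy: /sys/kernel/mm/transparent_hugepage/enabled brackets the active mode, so the pattern only matches "[never]" when THP is fully off, and only in the other modes does the verifier bother to sample AnonHugePages. A minimal sketch of that gate (the awk lookup stands in for the traced get_meminfo):

    #!/usr/bin/env bash
    # The kernel brackets the active THP mode, e.g. "always [madvise] never".
    thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled 2>/dev/null || echo '[never]')
    if [[ $thp != *"[never]"* ]]; then
        # THP may create anonymous hugepages, so sample the live counter
        anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)
    else
        anon=0
    fi
    echo "anon_hugepages=${anon:-0}"

In this run the policy string is "always [madvise] never", so the lookup runs and returns 0, as the trace shows next.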
00:03:08.925 15:18:54 -- setup/hugepages.sh@97 -- # anon=0
00:03:08.925 15:18:54 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:08.925 15:18:54 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:08.925 15:18:54 -- setup/common.sh@18 -- # local node=
00:03:08.925 15:18:54 -- setup/common.sh@19 -- # local var val
00:03:08.925 15:18:54 -- setup/common.sh@20 -- # local mem_f mem
00:03:08.925 15:18:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:08.925 15:18:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:08.925 15:18:54 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:08.925 15:18:54 -- setup/common.sh@28 -- # mapfile -t mem
00:03:08.925 15:18:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:08.925 15:18:54 -- setup/common.sh@31 -- # IFS=': '
00:03:08.925 15:18:54 -- setup/common.sh@31 -- # read -r var val _
00:03:08.925 15:18:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 42971040 kB' 'MemAvailable: 46691836 kB' 'Buffers: 8940 kB' 'Cached: 11211444 kB' 'SwapCached: 0 kB' 'Active: 8001864 kB' 'Inactive: 3688224 kB' 'Active(anon): 7584484 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 473040 kB' 'Mapped: 173216 kB' 'Shmem: 7114780 kB' 'KReclaimable: 224568 kB' 'Slab: 920852 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696284 kB' 'KernelStack: 21792 kB' 'PageTables: 7708 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8796228 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
[xtrace elided: field-by-field scan of /proc/meminfo; nothing matches HugePages_Surp until its own line]
00:03:08.927 15:18:54 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:08.927 15:18:54 -- setup/common.sh@33 -- # echo 0
00:03:08.927 15:18:54 -- setup/common.sh@33 -- # return 0
00:03:08.927 15:18:54 -- setup/hugepages.sh@99 -- # surp=0
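With anon=0 and surp=0 in hand, the verifier performs the same lookup for HugePages_Rsvd and HugePages_Total and then runs the consistency checks that appear at hugepages.sh@107 and @109 below. Condensed into a sketch (the helper and the exact statement order are approximations of the traced script, not its verbatim source):

    #!/usr/bin/env bash
    # meminfo_val: stand-in for the traced get_meminfo (field name is column 1)
    meminfo_val() { awk -v k="$1:" '$1 == k {print $2}' /proc/meminfo; }

    nr_hugepages=1024                        # set by the test via NRHUGE/HUGENODE
    total=$(meminfo_val HugePages_Total)     # 1024 in this run
    surp=$(meminfo_val HugePages_Surp)       # 0
    resv=$(meminfo_val HugePages_Rsvd)       # 0
    echo "nr_hugepages=$nr_hugepages"
    echo "resv_hugepages=$resv"
    echo "surplus_hugepages=$surp"
    echo "anon_hugepages=0"
    (( total == nr_hugepages + surp + resv ))   # cf. hugepages.sh@107
    (( total == nr_hugepages ))                 # cf. hugepages.sh@109

Both checks hold here (1024 == 1024 + 0 + 0), so the verification passes.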
00:03:08.927 15:18:54 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:08.927 15:18:54 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:08.927 15:18:54 -- setup/common.sh@18 -- # local node=
00:03:08.927 15:18:54 -- setup/common.sh@19 -- # local var val
00:03:08.927 15:18:54 -- setup/common.sh@20 -- # local mem_f mem
00:03:08.927 15:18:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:08.927 15:18:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:08.927 15:18:54 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:08.927 15:18:54 -- setup/common.sh@28 -- # mapfile -t mem
00:03:08.927 15:18:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:08.927 15:18:54 -- setup/common.sh@31 -- # IFS=': '
00:03:08.927 15:18:54 -- setup/common.sh@31 -- # read -r var val _
00:03:08.927 15:18:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 42971928 kB' 'MemAvailable: 46692724 kB' 'Buffers: 8940 kB' 'Cached: 11211444 kB' 'SwapCached: 0 kB' 'Active: 8001896 kB' 'Inactive: 3688224 kB' 'Active(anon): 7584516 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 473044 kB' 'Mapped: 173140 kB' 'Shmem: 7114780 kB' 'KReclaimable: 224568 kB' 'Slab: 920868 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696300 kB' 'KernelStack: 21808 kB' 'PageTables: 7744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8796244 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
[xtrace elided: the same field-by-field scan repeats; nothing matches HugePages_Rsvd until its own line]
-r var val _ 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # continue 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # continue 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # continue 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # continue 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # continue 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # continue 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # continue 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # continue 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # continue 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # continue 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # continue 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # continue 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # IFS=': ' 00:03:08.928 15:18:54 -- setup/common.sh@31 -- # read -r var val _ 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:08.928 15:18:54 -- setup/common.sh@32 -- # continue 00:03:08.928 15:18:54 -- 
setup/common.sh@31 -- # IFS=': '
00:03:08.928 15:18:54 -- setup/common.sh@31 -- # read -r var val _
[xtrace elided: the '[[ <key> == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]' / 'continue' pair repeats once per remaining meminfo key, Writeback through HugePages_Free]
00:03:08.929 15:18:54 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:08.929 15:18:54 -- setup/common.sh@33 -- # echo 0
00:03:08.929 15:18:54 -- setup/common.sh@33 -- # return 0
00:03:08.929 15:18:54 -- setup/hugepages.sh@100 -- # resv=0
00:03:08.929 15:18:54 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:08.929 nr_hugepages=1024
00:03:08.929 15:18:54 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:08.929 resv_hugepages=0
00:03:08.929 15:18:54 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:08.929 surplus_hugepages=0
00:03:08.929 15:18:54 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:08.929 anon_hugepages=0
00:03:08.929 15:18:54 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:08.929 15:18:54 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
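The lookups traced above are SPDK's get_meminfo helper scanning /proc/meminfo: an IFS=': ' read splits each line into key and value, every non-matching key falls through the '[[ ... ]] / continue' pair, and the first match echoes its value (0 for HugePages_Rsvd here). A minimal sketch of that helper, reconstructed from this xtrace rather than from the upstream setup/common.sh source, so names and details are approximate:

    shopt -s extglob   # the "Node N " strip below uses the +([0-9]) extended glob

    # get_meminfo <key> [<node>] -- sketch of the helper driving the trace above
    get_meminfo() {
        local get=$1 node=$2
        local var val line
        local mem_f=/proc/meminfo mem
        # With a node argument, read that node's own meminfo file instead
        # (used by the HugePages_Surp 0 / HugePages_Surp 1 calls later in this log).
        if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # per-node lines carry a "Node N " prefix
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"   # "HugePages_Total: 1024" -> var, val
            [[ $var == "$get" ]] || continue         # the repeated rounds elided above
            echo "${val:-0}"
            return 0
        done
    }

Called as get_meminfo HugePages_Total for the whole machine or get_meminfo HugePages_Surp 0 for a single node, which are exactly the calls this log shows next.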
00:03:08.929 15:18:54 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:08.929 15:18:54 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:08.929 15:18:54 -- setup/common.sh@18 -- # local node=
00:03:08.929 15:18:54 -- setup/common.sh@19 -- # local var val
00:03:08.929 15:18:54 -- setup/common.sh@20 -- # local mem_f mem
00:03:08.929 15:18:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:08.929 15:18:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:08.929 15:18:54 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:08.929 15:18:54 -- setup/common.sh@28 -- # mapfile -t mem
00:03:08.929 15:18:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:08.929 15:18:54 -- setup/common.sh@31 -- # IFS=': '
00:03:08.929 15:18:54 -- setup/common.sh@31 -- # read -r var val _
00:03:08.929 15:18:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 42971928 kB' 'MemAvailable: 46692724 kB' 'Buffers: 8940 kB' 'Cached: 11211444 kB' 'SwapCached: 0 kB' 'Active: 8001896 kB' 'Inactive: 3688224 kB' 'Active(anon): 7584516 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 473044 kB' 'Mapped: 173140 kB' 'Shmem: 7114780 kB' 'KReclaimable: 224568 kB' 'Slab: 920868 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696300 kB' 'KernelStack: 21808 kB' 'PageTables: 7744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8796256 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
[xtrace elided: the '[[ <key> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]' / 'continue' pair repeats once per meminfo key, MemTotal through Unaccepted]
00:03:08.931 15:18:54 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:08.931 15:18:54 -- setup/common.sh@33 -- # echo 1024
00:03:08.931 15:18:54 -- setup/common.sh@33 -- # return 0
00:03:08.931 15:18:54 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:08.931 15:18:54 -- setup/hugepages.sh@112 -- # get_nodes
00:03:08.931 15:18:54 -- setup/hugepages.sh@27 -- # local node
00:03:08.931 15:18:54 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:08.931 15:18:54 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:08.931 15:18:54 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:08.931 15:18:54 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:08.931 15:18:54 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:08.931 15:18:54 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
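get_nodes has just recorded 512 huge pages on each of the two system nodes. The per-node lookups that follow read /sys/devices/system/node/node<N>/meminfo instead of /proc/meminfo; every line in those files carries a node prefix ("Node 0 HugePages_Surp: 0"), which is what the mem=("${mem[@]#Node +([0-9]) }") step in the trace strips, using an extended glob, before the same key/value parse. A standalone illustration (run on a NUMA box; the node0 path is assumed to exist):

    shopt -s extglob                          # +([0-9]) below is an extended glob
    mapfile -t mem < /sys/devices/system/node/node0/meminfo
    mem=("${mem[@]#Node +([0-9]) }")          # "Node 0 HugePages_Surp: 0" -> "HugePages_Surp: 0"
    printf '%s\n' "${mem[@]}" | grep '^HugePages_'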
00:03:08.931 15:18:54 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:08.931 15:18:54 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:08.931 15:18:54 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:08.932 15:18:54 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:08.932 15:18:54 -- setup/common.sh@18 -- # local node=0
00:03:08.932 15:18:54 -- setup/common.sh@19 -- # local var val
00:03:08.932 15:18:54 -- setup/common.sh@20 -- # local mem_f mem
00:03:08.932 15:18:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:08.932 15:18:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:08.932 15:18:54 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:08.932 15:18:54 -- setup/common.sh@28 -- # mapfile -t mem
00:03:08.932 15:18:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:08.932 15:18:54 -- setup/common.sh@31 -- # IFS=': '
00:03:08.932 15:18:54 -- setup/common.sh@31 -- # read -r var val _
00:03:08.932 15:18:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 27390884 kB' 'MemUsed: 5243552 kB' 'SwapCached: 0 kB' 'Active: 3020484 kB' 'Inactive: 134724 kB' 'Active(anon): 2684104 kB' 'Inactive(anon): 0 kB' 'Active(file): 336380 kB' 'Inactive(file): 134724 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2999332 kB' 'Mapped: 145648 kB' 'AnonPages: 159036 kB' 'Shmem: 2528228 kB' 'KernelStack: 12328 kB' 'PageTables: 4276 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122404 kB' 'Slab: 457624 kB' 'SReclaimable: 122404 kB' 'SUnreclaim: 335220 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace elided: the '[[ <key> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]' / 'continue' pair repeats once per node0 meminfo key, MemTotal through HugePages_Free]
00:03:08.933 15:18:54 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:08.933 15:18:54 -- setup/common.sh@33 -- # echo 0
00:03:08.933 15:18:54 -- setup/common.sh@33 -- # return 0
00:03:08.933 15:18:54 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
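The same accumulation now repeats for node 1: each node's expected count starts from the reserved pages and then absorbs whatever surplus get_meminfo reports for that node. A sketch of the whole hugepages.sh@115-117 loop under this run's values (two nodes at 512 pages, resv=0), reusing the get_meminfo sketch above:

    # Values below are this run's: two nodes at 512 pages each, resv=0.
    nodes_test=(512 512)
    resv=0
    for node in "${!nodes_test[@]}"; do
        (( nodes_test[node] += resv ))                 # hugepages.sh@116
        surp=$(get_meminfo HugePages_Surp "$node")     # hugepages.sh@117
        (( nodes_test[node] += surp ))
    done
    # With zero surplus everywhere nodes_test stays (512 512), which is what the
    # "node0=512 expecting 512" / "node1=512 expecting 512" lines below report.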
00:03:08.933 15:18:54 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:08.933 15:18:54 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:08.933 15:18:54 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:08.933 15:18:54 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:08.933 15:18:54 -- setup/common.sh@18 -- # local node=1
00:03:08.933 15:18:54 -- setup/common.sh@19 -- # local var val
00:03:08.933 15:18:54 -- setup/common.sh@20 -- # local mem_f mem
00:03:08.933 15:18:54 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:08.933 15:18:54 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:08.933 15:18:54 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:08.933 15:18:54 -- setup/common.sh@28 -- # mapfile -t mem
00:03:08.933 15:18:54 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:08.933 15:18:54 -- setup/common.sh@31 -- # IFS=': '
00:03:08.933 15:18:54 -- setup/common.sh@31 -- # read -r var val _
00:03:08.933 15:18:54 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649328 kB' 'MemFree: 15581392 kB' 'MemUsed: 12067936 kB' 'SwapCached: 0 kB' 'Active: 4981756 kB' 'Inactive: 3553500 kB' 'Active(anon): 4900756 kB' 'Inactive(anon): 0 kB' 'Active(file): 81000 kB' 'Inactive(file): 3553500 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8221104 kB' 'Mapped: 27492 kB' 'AnonPages: 314320 kB' 'Shmem: 4586604 kB' 'KernelStack: 9480 kB' 'PageTables: 3468 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 102164 kB' 'Slab: 463244 kB' 'SReclaimable: 102164 kB' 'SUnreclaim: 361080 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace elided: the '[[ <key> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]' / 'continue' pair repeats once per node1 meminfo key, MemTotal through HugePages_Free]
00:03:08.934 15:18:54 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:08.934 15:18:54 -- setup/common.sh@33 -- # echo 0
00:03:08.934 15:18:54 -- setup/common.sh@33 -- # return 0
00:03:08.934 15:18:54 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:08.934 15:18:54 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:08.934 15:18:54 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:08.934 15:18:54 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:08.934 15:18:54 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:08.934 node0=512 expecting 512
00:03:08.934 15:18:54 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:08.934 15:18:54 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:08.934 15:18:54 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:08.934 15:18:54 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:03:08.934 node1=512 expecting 512
00:03:08.934 15:18:54 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:08.934
00:03:08.934 real 0m3.530s
00:03:08.934 user 0m1.283s
00:03:08.934 sys 0m2.297s
00:03:08.934 15:18:54 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:08.934 15:18:54 -- common/autotest_common.sh@10 -- # set +x
00:03:08.935 ************************************
00:03:08.935 END TEST per_node_1G_alloc
00:03:08.935 ************************************
00:03:08.935 15:18:54 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:03:08.935 15:18:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:08.935 15:18:54 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:08.935 15:18:54 -- common/autotest_common.sh@10 -- # set +x
00:03:08.935 ************************************
00:03:08.935 START TEST even_2G_alloc
00:03:08.935 ************************************
00:03:08.935 15:18:54 -- common/autotest_common.sh@1114 -- # even_2G_alloc
00:03:08.935 15:18:54 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:03:08.935 15:18:54 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:08.935 15:18:54 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:08.935 15:18:54 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:08.935 15:18:54 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
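The even_2G_alloc test that starts here requests size=2097152 (kB, i.e. 2 GiB); with the 2048 kB Hugepagesize shown in the meminfo dumps above, that works out to the nr_hugepages=1024 at hugepages.sh@57. A sketch of that arithmetic (the real get_test_nr_hugepages may round or clamp differently; only the checks visible in the trace are shown):

    size=2097152            # requested test size in kB (2 GiB)
    default_hugepages=2048  # Hugepagesize from the meminfo dumps above, in kB
    if (( size >= default_hugepages )); then          # hugepages.sh@55
        nr_hugepages=$(( size / default_hugepages ))  # 2097152 / 2048 = 1024
    fi
    echo "nr_hugepages=$nr_hugepages"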
00:03:08.935 15:18:54 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:08.935 15:18:54 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:08.935 15:18:54 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:08.935 15:18:54 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:08.935 15:18:54 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:08.935 15:18:54 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:08.935 15:18:54 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:08.935 15:18:54 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:08.935 15:18:54 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:08.935 15:18:54 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:08.935 15:18:54 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:08.935 15:18:54 -- setup/hugepages.sh@83 -- # : 512
00:03:08.935 15:18:54 -- setup/hugepages.sh@84 -- # : 1
00:03:08.935 15:18:54 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:08.935 15:18:54 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:08.935 15:18:54 -- setup/hugepages.sh@83 -- # : 0
00:03:08.935 15:18:54 -- setup/hugepages.sh@84 -- # : 0
00:03:08.935 15:18:54 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
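get_test_nr_hugepages_per_node then spreads those 1024 pages over the machine's two NUMA nodes, filling nodes_test from the last index down, as the hugepages.sh@81-84 loop shows. The trace assigns the literal 512; a sketch assuming an even split (the division is my assumption, and remainder handling is not shown in the trace):

    _nr_hugepages=1024   # from get_test_nr_hugepages above
    _no_nodes=2          # NUMA nodes found by get_nodes
    per_node=$(( _nr_hugepages / _no_nodes ))   # 512 in this run
    nodes_test=()
    while (( _no_nodes > 0 )); do                # hugepages.sh@81
        nodes_test[_no_nodes - 1]=$per_node      # hugepages.sh@82, last node first
        (( _no_nodes-- ))
    done
    declare -p nodes_test   # declare -a nodes_test=([0]="512" [1]="512")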
get=AnonHugePages 00:03:12.229 15:18:58 -- setup/common.sh@18 -- # local node= 00:03:12.229 15:18:58 -- setup/common.sh@19 -- # local var val 00:03:12.229 15:18:58 -- setup/common.sh@20 -- # local mem_f mem 00:03:12.229 15:18:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:12.229 15:18:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:12.229 15:18:58 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:12.229 15:18:58 -- setup/common.sh@28 -- # mapfile -t mem 00:03:12.229 15:18:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:12.229 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.229 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.229 15:18:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 42999516 kB' 'MemAvailable: 46720312 kB' 'Buffers: 8940 kB' 'Cached: 11211568 kB' 'SwapCached: 0 kB' 'Active: 8002060 kB' 'Inactive: 3688224 kB' 'Active(anon): 7584680 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 473124 kB' 'Mapped: 172200 kB' 'Shmem: 7114904 kB' 'KReclaimable: 224568 kB' 'Slab: 921220 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696652 kB' 'KernelStack: 21808 kB' 'PageTables: 7672 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8789708 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB' 00:03:12.229 15:18:58 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.229 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.229 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.229 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.229 15:18:58 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.229 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.229 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.229 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.229 15:18:58 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.229 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.229 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.229 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.229 15:18:58 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.229 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.229 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.229 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.229 15:18:58 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.229 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.229 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.229 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.229 15:18:58 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:12.229 15:18:58 -- 
setup/common.sh@32 -- # continue
00:03:12.229 15:18:58 -- setup/common.sh@31 -- # IFS=': '
00:03:12.229 15:18:58 -- setup/common.sh@31 -- # read -r var val _
[trace condensed: the identical "[[ <key> == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] / continue / IFS=': ' / read -r var val _" cycle repeats for every remaining /proc/meminfo key, Inactive through HardwareCorrupted]
00:03:12.492 15:18:58 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:12.492 15:18:58 -- setup/common.sh@33 -- # echo 0
00:03:12.492 15:18:58 -- setup/common.sh@33 -- # return 0
00:03:12.492 15:18:58 -- setup/hugepages.sh@97 -- # anon=0
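The condensed cycle above is setup/common.sh's get_meminfo scanning one "Key: value" pair per iteration until it hits the requested key. A minimal sketch of that idiom, reconstructed from the xtrace statements visible in this log (the names and structure mirror the trace, but this is an inferred rendering, not the verbatim SPDK helper):

  #!/usr/bin/env bash
  # Sketch reconstructed from the trace -- not the verbatim SPDK source.
  shopt -s extglob   # needed for the +([0-9]) pattern below

  get_meminfo() {
      local get=$1 node=${2:-}   # key to fetch, optional NUMA node
      local var val _ mem
      local mem_f=/proc/meminfo
      # Per-node queries read that node's own meminfo file instead.
      [[ -e /sys/devices/system/node/node$node/meminfo ]] &&
          mem_f=/sys/devices/system/node/node$node/meminfo
      mapfile -t mem < "$mem_f"
      # Node files prefix every line with "Node N "; strip it (extglob).
      mem=("${mem[@]#Node +([0-9]) }")
      # The long [[ key ]] / continue run in the log is this scan.
      while IFS=': ' read -r var val _; do
          [[ $var == "$get" ]] || continue
          echo "$val"
          return 0
      done < <(printf '%s\n' "${mem[@]}")
      return 1
  }

  get_meminfo AnonHugePages      # -> 0 (kB) in this run
  get_meminfo HugePages_Surp 0   # per-node form: node0's surplus count

The scan is linear in the number of meminfo keys, which is why each lookup produces the long repeated trace block seen here.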
00:03:12.492 15:18:58 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:12.492 15:18:58 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:12.492 15:18:58 -- setup/common.sh@18 -- # local node=
00:03:12.492 15:18:58 -- setup/common.sh@19 -- # local var val
00:03:12.492 15:18:58 -- setup/common.sh@20 -- # local mem_f mem
00:03:12.492 15:18:58 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:12.492 15:18:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:12.492 15:18:58 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:12.492 15:18:58 -- setup/common.sh@28 -- # mapfile -t mem
00:03:12.492 15:18:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:12.493 15:18:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 43001056 kB' 'MemAvailable: 46721852 kB' 'Buffers: 8940 kB' 'Cached: 11211572 kB' 'SwapCached: 0 kB' 'Active: 8001628 kB' 'Inactive: 3688224 kB' 'Active(anon): 7584248 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 472700 kB' 'Mapped: 172180 kB' 'Shmem: 7114908 kB' 'KReclaimable: 224568 kB' 'Slab: 921292 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696724 kB' 'KernelStack: 21760 kB' 'PageTables: 7544 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8789720 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
[trace condensed: key scan as above, MemTotal through HugePages_Rsvd, until HugePages_Surp matches]
00:03:12.494 15:18:58 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:12.494 15:18:58 -- setup/common.sh@33 -- # echo 0
00:03:12.494 15:18:58 -- setup/common.sh@33 -- # return 0
00:03:12.494 15:18:58 -- setup/hugepages.sh@99 -- # surp=0
00:03:12.494 15:18:58 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
[trace condensed: get_meminfo prologue as above, with get=HugePages_Rsvd and mem_f=/proc/meminfo]
00:03:12.495 15:18:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 43001528 kB' 'MemAvailable: 46722324 kB' 'Buffers: 8940 kB' 'Cached: 11211584 kB' 'SwapCached: 0 kB' 'Active: 8001636 kB' 'Inactive: 3688224 kB' 'Active(anon): 7584256 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 472684 kB' 'Mapped: 172180 kB' 'Shmem: 7114920 kB' 'KReclaimable: 224568 kB' 'Slab: 921292 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696724 kB' 'KernelStack: 21760 kB' 'PageTables: 7544 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8789736 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
[trace condensed: key scan as above, MemTotal through HugePages_Free, until HugePages_Rsvd matches]
00:03:12.496 15:18:58 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:12.496 15:18:58 -- setup/common.sh@33 -- # echo 0
00:03:12.496 15:18:58 -- setup/common.sh@33 -- # return 0
00:03:12.496 15:18:58 -- setup/hugepages.sh@100 -- # resv=0
00:03:12.496 15:18:58 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:12.496 nr_hugepages=1024
00:03:12.496 15:18:58 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:12.496 resv_hugepages=0
00:03:12.496 15:18:58 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:12.496 surplus_hugepages=0
00:03:12.496 15:18:58 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:12.496 anon_hugepages=0
00:03:12.496 15:18:58 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:12.496 15:18:58 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
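The hugepages.sh@107 line is the accounting audit for this test step: the configured page count must equal what the kernel reports, with no surplus or reserved pages outstanding. Spelled out with this run's values (a worked restatement of the trace, nothing more):

  nr_hugepages=1024   # requested page count (HugePages_Total)
  surp=0              # HugePages_Surp, looked up above
  resv=0              # HugePages_Rsvd, looked up above
  anon=0              # AnonHugePages (kB), tracked separately above
  (( 1024 == nr_hugepages + surp + resv )) && echo 'hugepage accounting consistent'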
00:03:12.496 15:18:58 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
[trace condensed: get_meminfo prologue as above, with get=HugePages_Total and mem_f=/proc/meminfo]
00:03:12.497 15:18:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 43001276 kB' 'MemAvailable: 46722072 kB' 'Buffers: 8940 kB' 'Cached: 11211612 kB' 'SwapCached: 0 kB' 'Active: 8001948 kB' 'Inactive: 3688224 kB' 'Active(anon): 7584568 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 472988 kB' 'Mapped: 172180 kB' 'Shmem: 7114948 kB' 'KReclaimable: 224568 kB' 'Slab: 921292 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696724 kB' 'KernelStack: 21760 kB' 'PageTables: 7544 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8789752 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
[trace condensed: key scan as above, MemTotal through Unaccepted, until HugePages_Total matches]
00:03:12.498 15:18:58 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:12.498 15:18:58 -- setup/common.sh@33 -- # echo 1024
00:03:12.498 15:18:58 -- setup/common.sh@33 -- # return 0
00:03:12.498 15:18:58 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
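After the global check, get_nodes records how many pages each NUMA node holds; on this 2-node box the 1024 pages split evenly. A sketch of that step using the get_meminfo form from the earlier sketch (how the real script obtains the per-node count is not fully shown in this trace, so the lookup below is an assumption, not the verbatim helper; it relies on shopt -s extglob being set, as above):

  nodes_sys=()
  for node in /sys/devices/system/node/node+([0-9]); do   # extglob glob, as traced
      nodes_sys[${node##*node}]=$(get_meminfo HugePages_Total "${node##*node}")
  done
  no_nodes=${#nodes_sys[@]}                               # 2 in this run
  echo "node0=${nodes_sys[0]} node1=${nodes_sys[1]}"      # 512 / 512

The per-node node0/node1 lookups that follow then verify each node reports zero surplus pages.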
00:03:12.498 15:18:58 -- setup/hugepages.sh@112 -- # get_nodes
00:03:12.498 15:18:58 -- setup/hugepages.sh@27 -- # local node
00:03:12.498 15:18:58 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:12.498 15:18:58 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:12.498 15:18:58 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:12.499 15:18:58 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:12.499 15:18:58 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:12.499 15:18:58 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:12.499 15:18:58 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:12.499 15:18:58 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:12.499 15:18:58 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:12.499 15:18:58 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:12.499 15:18:58 -- setup/common.sh@18 -- # local node=0
00:03:12.499 15:18:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:12.499 15:18:58 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:12.499 15:18:58 -- setup/common.sh@28 -- # mapfile -t mem
00:03:12.499 15:18:58 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:12.499 15:18:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 27405384 kB' 'MemUsed: 5229052 kB' 'SwapCached: 0 kB' 'Active: 3020340 kB' 'Inactive: 134724 kB' 'Active(anon): 2683960 kB' 'Inactive(anon): 0 kB' 'Active(file): 336380 kB' 'Inactive(file): 134724 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2999380 kB' 'Mapped: 145352 kB' 'AnonPages: 158760 kB' 'Shmem: 2528276 kB' 'KernelStack: 12296 kB' 'PageTables: 4132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122404 kB' 'Slab: 458088 kB' 'SReclaimable: 122404 kB' 'SUnreclaim: 335684 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[trace condensed: node0 key scan as above, MemTotal through HugePages_Free, until HugePages_Surp matches]
00:03:12.500 15:18:58 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:12.500 15:18:58 -- setup/common.sh@33 -- # echo 0
00:03:12.500 15:18:58 -- setup/common.sh@33 -- # return 0
00:03:12.500 15:18:58 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:12.500 15:18:58 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:12.500 15:18:58 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:12.500 15:18:58 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:12.500 15:18:58 -- setup/common.sh@18 -- # local node=1
00:03:12.500 15:18:58 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:12.500 15:18:58 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:12.500 15:18:58 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649328 kB' 'MemFree: 15599080 kB' 'MemUsed: 12050248 kB' 'SwapCached: 0 kB' 'Active: 4981628 kB' 'Inactive: 3553500 kB' 'Active(anon): 4900628 kB' 'Inactive(anon): 0 kB' 'Active(file): 81000 kB' 'Inactive(file): 3553500 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8221184 kB' 'Mapped: 26828 kB' 'AnonPages: 314108 kB' 'Shmem: 4586684 kB' 'KernelStack: 9448 kB' 'PageTables: 3372 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 102164 kB' 'Slab: 463204 kB'
'SReclaimable: 102164 kB' 'SUnreclaim: 361040 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.500 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.500 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 
00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- 
setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # continue 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # IFS=': ' 00:03:12.501 15:18:58 -- setup/common.sh@31 -- # read -r var val _ 00:03:12.501 15:18:58 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:12.501 15:18:58 -- setup/common.sh@33 -- # echo 0 00:03:12.501 15:18:58 -- setup/common.sh@33 -- # return 0 00:03:12.501 15:18:58 -- setup/hugepages.sh@117 -- # (( 
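The two reads above are the same helper walking a per-node meminfo file: pick /sys/devices/system/node/nodeN/meminfo when it exists, strip the "Node N " prefix so it parses like /proc/meminfo, then scan key/value pairs until the requested key matches. A minimal standalone sketch of that pattern, reconstructed from the trace rather than quoted from the SPDK source:

#!/usr/bin/env bash
# Sketch of the lookup traced above (simplified; not the verbatim
# setup/common.sh helper).
shopt -s extglob  # needed for the +([0-9]) pattern below

get_meminfo() {
    local get=$1 node=$2
    local var val _ line
    local mem_f=/proc/meminfo
    local -a mem

    # Per-node stats live in sysfs; fall back to the global file.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    mapfile -t mem < "$mem_f"
    # Node files prefix every line with "Node N "; strip it so both
    # sources parse identically.
    mem=("${mem[@]#Node +([0-9]) }")

    # Scan "Key: value [kB]" lines until the requested key matches,
    # mirroring the compare/continue loop in the xtrace.
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done
    return 1
}

# Example: surplus huge pages on NUMA node 0 (prints 0 in the run above).
get_meminfo HugePages_Surp 0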
00:03:12.501 15:18:58 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:12.501 15:18:58 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:12.501 15:18:58 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:12.501 15:18:58 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:03:12.501 15:18:58 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:12.501 15:18:58 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:12.501 15:18:58 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:12.501 15:18:58 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
node1=512 expecting 512
00:03:12.501 15:18:58 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:12.501
00:03:12.501 real	0m3.540s
00:03:12.501 user	0m1.294s
00:03:12.501 sys	0m2.261s
00:03:12.501 15:18:58 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:12.501 15:18:58 -- common/autotest_common.sh@10 -- # set +x
00:03:12.501 ************************************
00:03:12.501 END TEST even_2G_alloc
00:03:12.501 ************************************
00:03:12.501 15:18:58 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:03:12.501 15:18:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:12.501 15:18:58 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:12.501 15:18:58 -- common/autotest_common.sh@10 -- # set +x
00:03:12.501 ************************************
00:03:12.501 START TEST odd_alloc
00:03:12.501 ************************************
00:03:12.502 15:18:58 -- common/autotest_common.sh@1114 -- # odd_alloc
00:03:12.502 15:18:58 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:03:12.502 15:18:58 -- setup/hugepages.sh@49 -- # local size=2098176
00:03:12.502 15:18:58 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:12.502 15:18:58 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:12.502 15:18:58 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:03:12.502 15:18:58 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:12.502 15:18:58 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:12.502 15:18:58 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:12.502 15:18:58 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:03:12.502 15:18:58 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:12.502 15:18:58 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:12.502 15:18:58 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:12.502 15:18:58 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:12.502 15:18:58 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:12.502 15:18:58 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:12.502 15:18:58 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:12.502 15:18:58 -- setup/hugepages.sh@83 -- # : 513
00:03:12.502 15:18:58 -- setup/hugepages.sh@84 -- # : 1
00:03:12.502 15:18:58 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:12.502 15:18:58 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:03:12.502 15:18:58 -- setup/hugepages.sh@83 -- # : 0
00:03:12.502 15:18:58 -- setup/hugepages.sh@84 -- # : 0
00:03:12.502 15:18:58 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:12.502 15:18:58 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:03:12.502 15:18:58 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:03:12.502 15:18:58 -- setup/hugepages.sh@160 -- # setup output
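The odd_alloc prologue above turns a 2049 MB request (an odd number of 2 MB pages, hence the test name) into nr_hugepages=1025 and then splits it across the two NUMA nodes as 512 and 513. A hedged sketch of that arithmetic; the round-up expression and the division-from-the-back split are inferred from the traced values (size=2098176 kB, nr_hugepages=1025, assignments of 512 then 513 at @82), not quoted from setup/hugepages.sh:

# Page-count arithmetic behind the trace above (reconstruction).
HUGEMEM=2049                 # MB requested by the odd_alloc test
default_hugepages=2048       # kB per 2M huge page
size=$((HUGEMEM * 1024))     # 2098176 kB, matching hugepages.sh@49
nr_hugepages=$(( (size + default_hugepages - 1) / default_hugepages ))
echo "nr_hugepages=$nr_hugepages"   # 1025

# Per-node split: integer division from the last node backwards, so the
# odd remainder lands on node 0, reproducing the 512/513 assignments.
_nr_hugepages=$nr_hugepages
_no_nodes=2
nodes_test=()
while (( _no_nodes > 0 )); do
    nodes_test[_no_nodes - 1]=$((_nr_hugepages / _no_nodes))
    (( _nr_hugepages -= nodes_test[_no_nodes - 1] ))
    (( _no_nodes-- ))
done
echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"   # node0=513 node1=512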
00:03:12.502 15:18:58 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:12.502 15:18:58 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:16.689 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:16.689 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:16.689 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:16.689 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:16.689 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:16.689 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:16.689 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:16.689 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:16.689 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:16.689 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:16.689 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:16.689 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:16.689 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:16.689 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:16.689 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:16.689 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:16.689 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:16.689 15:19:01 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:03:16.689 15:19:01 -- setup/hugepages.sh@89 -- # local node
00:03:16.689 15:19:01 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:16.689 15:19:01 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:16.689 15:19:01 -- setup/hugepages.sh@92 -- # local surp
00:03:16.689 15:19:01 -- setup/hugepages.sh@93 -- # local resv
00:03:16.689 15:19:01 -- setup/hugepages.sh@94 -- # local anon
00:03:16.689 15:19:01 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:16.689 15:19:01 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:16.689 15:19:01 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:16.689 15:19:01 -- setup/common.sh@18 -- # local node=
00:03:16.689 15:19:01 -- setup/common.sh@19 -- # local var val
00:03:16.689 15:19:01 -- setup/common.sh@20 -- # local mem_f mem
00:03:16.689 15:19:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:16.689 15:19:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:16.689 15:19:01 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:16.689 15:19:01 -- setup/common.sh@28 -- # mapfile -t mem
00:03:16.689 15:19:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:16.689 15:19:01 -- setup/common.sh@31 -- # IFS=': '
00:03:16.689 15:19:01 -- setup/common.sh@31 -- # read -r var val _
00:03:16.690 15:19:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 43042388 kB' 'MemAvailable: 46763184 kB' 'Buffers: 8940 kB' 'Cached: 11211712 kB' 'SwapCached: 0 kB' 'Active: 8003816 kB' 'Inactive: 3688224 kB' 'Active(anon): 7586436 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 474700 kB' 'Mapped: 172192 kB' 'Shmem: 7115048 kB' 'KReclaimable: 224568 kB' 'Slab: 920856 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696288 kB' 'KernelStack: 21840 kB' 'PageTables: 7656 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480884 kB' 'Committed_AS: 8795032 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214448 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
00:03:16.690 15:19:01 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:16.690 15:19:01 -- setup/common.sh@32 -- # continue
[... the @31 read / @32 compare-and-continue xtrace repeats for each remaining meminfo key ...]
00:03:16.691 15:19:01 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:16.691 15:19:01 -- setup/common.sh@33 -- # echo 0
00:03:16.691 15:19:01 -- setup/common.sh@33 -- # return 0
00:03:16.691 15:19:01 -- setup/hugepages.sh@97 -- # anon=0
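For these node-less reads the scan is just a key lookup in /proc/meminfo, and the anon read is gated on transparent huge pages being enabled at all. A short sketch of equivalent commands, same results as the traced scans but not the mechanism the script uses; the THP gate mirrors the `[[ always [madvise] never != *\[\n\e\v\e\r\]* ]]` test at hugepages.sh@96:

# Equivalent direct lookups (illustrative stand-ins for get_meminfo):
awk '$1 == "AnonHugePages:" {print $2}' /proc/meminfo    # -> 0 in this run
awk '$1 == "HugePages_Surp:" {print $2}' /proc/meminfo   # -> 0 in this run

# THP gate: the sysfs file reads like "always [madvise] never", where
# the bracketed token is the active mode; anon THP only counts when the
# mode is not "never".
thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
if [[ $thp != *"[never]"* ]]; then
    anon=$(awk '$1 == "AnonHugePages:" {print $2}' /proc/meminfo)
else
    anon=0
fi
echo "anon=$anon"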
00:03:16.691 15:19:01 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:16.691 15:19:01 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:16.691 15:19:01 -- setup/common.sh@18 -- # local node=
00:03:16.691 15:19:01 -- setup/common.sh@19 -- # local var val
00:03:16.691 15:19:01 -- setup/common.sh@20 -- # local mem_f mem
00:03:16.691 15:19:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:16.691 15:19:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:16.691 15:19:01 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:16.691 15:19:01 -- setup/common.sh@28 -- # mapfile -t mem
00:03:16.691 15:19:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:16.691 15:19:01 -- setup/common.sh@31 -- # IFS=': '
00:03:16.691 15:19:01 -- setup/common.sh@31 -- # read -r var val _
00:03:16.691 15:19:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 43050228 kB' 'MemAvailable: 46771024 kB' 'Buffers: 8940 kB' 'Cached: 11211716 kB' 'SwapCached: 0 kB' 'Active: 8004288 kB' 'Inactive: 3688224 kB' 'Active(anon): 7586908 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 475168 kB' 'Mapped: 172192 kB' 'Shmem: 7115052 kB' 'KReclaimable: 224568 kB' 'Slab: 920836 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696268 kB' 'KernelStack: 21888 kB' 'PageTables: 7572 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480884 kB' 'Committed_AS: 8795204 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214496 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
00:03:16.691 15:19:01 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:16.691 15:19:01 -- setup/common.sh@32 -- # continue
[... the @31 read / @32 compare-and-continue xtrace repeats for each remaining meminfo key ...]
00:03:16.692 15:19:01 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:16.692 15:19:01 -- setup/common.sh@33 -- # echo 0
00:03:16.692 15:19:01 -- setup/common.sh@33 -- # return 0
00:03:16.692 15:19:01 -- setup/hugepages.sh@99 -- # surp=0
00:03:16.692 15:19:01 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:16.692 15:19:01 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:16.692 15:19:01 -- setup/common.sh@18 -- # local node=
00:03:16.692 15:19:01 -- setup/common.sh@19 -- # local var val
00:03:16.692 15:19:01 -- setup/common.sh@20 -- # local mem_f mem
00:03:16.692 15:19:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:16.692 15:19:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:16.692 15:19:01 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:16.692 15:19:01 -- setup/common.sh@28 -- # mapfile -t mem
00:03:16.692 15:19:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:16.692 15:19:01 -- setup/common.sh@31 -- # IFS=': '
00:03:16.692 15:19:01 -- setup/common.sh@31 -- # read -r var val _
00:03:16.692 15:19:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 43049848 kB' 'MemAvailable: 46770644 kB' 'Buffers: 8940 kB' 'Cached: 11211728 kB' 'SwapCached: 0 kB' 'Active: 8003928 kB' 'Inactive: 3688224 kB' 'Active(anon): 7586548 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 474768 kB' 'Mapped: 172192 kB' 'Shmem: 7115064 kB' 'KReclaimable: 224568 kB' 'Slab: 920892 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696324 kB' 'KernelStack: 21984 kB' 'PageTables: 7980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480884 kB' 'Committed_AS: 8795216 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214448 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
00:03:16.692 15:19:01 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:16.692 15:19:01 -- setup/common.sh@32 -- # continue
[... the @31 read / @32 compare-and-continue xtrace continues for the remaining meminfo keys ...]
-- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- 
setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.693 15:19:01 -- setup/common.sh@32 -- # continue 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.693 15:19:01 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.694 15:19:01 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.694 15:19:01 -- setup/common.sh@33 -- # echo 0 00:03:16.694 15:19:01 -- setup/common.sh@33 -- # return 0 00:03:16.694 15:19:01 -- setup/hugepages.sh@100 -- # resv=0 00:03:16.694 15:19:01 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:03:16.694 nr_hugepages=1025 00:03:16.694 15:19:01 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:16.694 resv_hugepages=0 00:03:16.694 15:19:01 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:16.694 surplus_hugepages=0 00:03:16.694 15:19:01 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:16.694 anon_hugepages=0 00:03:16.694 15:19:01 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:16.694 15:19:01 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:03:16.694 15:19:01 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:16.694 15:19:01 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:16.694 15:19:01 -- setup/common.sh@18 -- # local node= 00:03:16.694 15:19:01 -- 
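For anyone reading along: the block above is one call to the get_meminfo helper from test/setup/common.sh, and the helper can be reconstructed almost line for line from the xtrace. A minimal standalone sketch, paraphrased from the trace rather than copied from the SPDK tree, so details may differ:

# Sketch of get_meminfo as it can be read off the xtrace above.
shopt -s extglob   # needed for the +([0-9]) pattern below
get_meminfo() {
	local get=$1 node=$2
	local var val
	local mem_f mem
	mem_f=/proc/meminfo
	# With a node argument, prefer the per-node view; its lines carry a
	# "Node N " prefix that the stripping two lines down removes.
	if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi
	mapfile -t mem < "$mem_f"
	mem=("${mem[@]#Node +([0-9]) }")
	# IFS=': ' splits "HugePages_Rsvd:   0" into var=HugePages_Rsvd, val=0.
	while IFS=': ' read -r var val _; do
		[[ $var == "$get" ]] && echo "$val" && return 0
	done < <(printf '%s\n' "${mem[@]}")
	return 1
}
get_meminfo HugePages_Rsvd     # -> 0, as in the trace
get_meminfo HugePages_Surp 0   # -> node0's surplus, read from the per-node file

With this in place, the echo 0 / return 0 pair seen above is simply the loop hitting the requested key and printing its value.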
00:03:16.694 15:19:01 -- setup/common.sh@19 -- # local var val
00:03:16.694 15:19:01 -- setup/common.sh@20 -- # local mem_f mem
00:03:16.694 15:19:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:16.694 15:19:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:16.694 15:19:01 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:16.694 15:19:01 -- setup/common.sh@28 -- # mapfile -t mem
00:03:16.694 15:19:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:16.694 15:19:01 -- setup/common.sh@31 -- # IFS=': '
00:03:16.694 15:19:01 -- setup/common.sh@31 -- # read -r var val _
00:03:16.694 15:19:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 43048528 kB' 'MemAvailable: 46769324 kB' 'Buffers: 8940 kB' 'Cached: 11211744 kB' 'SwapCached: 0 kB' 'Active: 8003412 kB' 'Inactive: 3688224 kB' 'Active(anon): 7586032 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 474256 kB' 'Mapped: 172192 kB' 'Shmem: 7115080 kB' 'KReclaimable: 224568 kB' 'Slab: 920892 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696324 kB' 'KernelStack: 21728 kB' 'PageTables: 7420 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480884 kB' 'Committed_AS: 8790684 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
00:03:16.694 [xtrace condensed: the same setup/common.sh@31-32 read/compare/continue cycle walks every key above until HugePages_Total matches]
00:03:16.695 15:19:01 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:16.695 15:19:01 -- setup/common.sh@33 -- # echo 1025
00:03:16.695 15:19:01 -- setup/common.sh@33 -- # return 0
00:03:16.695 15:19:01 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:03:16.695 15:19:01 -- setup/hugepages.sh@112 -- # get_nodes
00:03:16.695 15:19:01 -- setup/hugepages.sh@27 -- # local node
00:03:16.695 15:19:01 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:16.695 15:19:01 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:16.695 15:19:01 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:16.695 15:19:01 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:03:16.695 15:19:01 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:16.695 15:19:01 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:16.695 15:19:01 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:16.695 15:19:01 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:16.695 15:19:01 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:16.695 15:19:01 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:16.695 15:19:01 -- setup/common.sh@18 -- # local node=0
00:03:16.695 15:19:01 -- setup/common.sh@19 -- # local var val
00:03:16.695 15:19:01 -- setup/common.sh@20 -- # local mem_f mem
00:03:16.695 15:19:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:16.695 15:19:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:16.695 15:19:01 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:16.695 15:19:01 -- setup/common.sh@28 -- # mapfile -t mem
00:03:16.695 15:19:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:16.695 15:19:01 -- setup/common.sh@31 -- # IFS=': '
00:03:16.695 15:19:01 -- setup/common.sh@31 -- # read -r var val _
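The get_nodes call traced above fills nodes_sys with one entry per NUMA node (512 and 513 on this machine). How those per-node counts are obtained is not visible in this excerpt; the sketch below reads them through the get_meminfo sketch given earlier, which is an assumption:

# Sketch of get_nodes (setup/hugepages.sh@27-@33); the 512/513 values are
# only visible as assignments in the trace, so the lookup is assumed.
shopt -s extglob nullglob
declare -A nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do
	node_id=${node##*node}   # /sys/devices/system/node/node0 -> 0
	nodes_sys[$node_id]=$(get_meminfo HugePages_Total "$node_id")
done
no_nodes=${#nodes_sys[@]}    # 2 on this machine
(( no_nodes > 0 ))           # the helper bails out if no nodes are visible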
00:03:16.695 15:19:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 27414888 kB' 'MemUsed: 5219548 kB' 'SwapCached: 0 kB' 'Active: 3021580 kB' 'Inactive: 134724 kB' 'Active(anon): 2685200 kB' 'Inactive(anon): 0 kB' 'Active(file): 336380 kB' 'Inactive(file): 134724 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2999428 kB' 'Mapped: 145364 kB' 'AnonPages: 160060 kB' 'Shmem: 2528324 kB' 'KernelStack: 12248 kB' 'PageTables: 3980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122404 kB' 'Slab: 458068 kB' 'SReclaimable: 122404 kB' 'SUnreclaim: 335664 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:16.695 [xtrace condensed: the setup/common.sh@31-32 read/compare/continue cycle walks the node0 keys until HugePages_Surp matches]
00:03:16.696 15:19:01 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:16.696 15:19:01 -- setup/common.sh@33 -- # echo 0
00:03:16.696 15:19:01 -- setup/common.sh@33 -- # return 0
00:03:16.696 15:19:01 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:16.696 15:19:01 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:16.696 15:19:01 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:16.696 15:19:01 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:16.696 15:19:01 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:16.696 15:19:01 -- setup/common.sh@18 -- # local node=1
00:03:16.696 15:19:01 -- setup/common.sh@19 -- # local var val
00:03:16.696 15:19:01 -- setup/common.sh@20 -- # local mem_f mem
00:03:16.696 15:19:01 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:16.696 15:19:01 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:16.696 15:19:01 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:16.696 15:19:01 -- setup/common.sh@28 -- # mapfile -t mem
00:03:16.696 15:19:01 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:16.696 15:19:01 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649328 kB' 'MemFree: 15632128 kB' 'MemUsed: 12017200 kB' 'SwapCached: 0 kB' 'Active: 4981068 kB' 'Inactive: 3553500 kB' 'Active(anon): 4900068 kB' 'Inactive(anon): 0 kB' 'Active(file): 81000 kB' 'Inactive(file): 3553500 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8221292 kB' 'Mapped: 26828 kB' 'AnonPages: 313400 kB' 'Shmem: 4586792 kB' 'KernelStack: 9432 kB' 'PageTables: 3312 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 102164 kB' 'Slab: 462824 kB' 'SReclaimable: 102164 kB' 'SUnreclaim: 360660 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
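The @115-@117 pass folds the global reservation and each node's surplus into the expected per-node counts. A sketch of that loop, reusing the get_meminfo sketch from above, with sample values taken from this run:

# Sketch of the fold at setup/hugepages.sh@115-@117.
resv=0                         # HugePages_Rsvd from the earlier lookup
nodes_test=([0]=512 [1]=513)   # expected split of the 1025 odd_alloc pages
for node in "${!nodes_test[@]}"; do
	(( nodes_test[node] += resv ))               # fold in the global reservation
	surp=$(get_meminfo HugePages_Surp "$node")   # per-node surplus, 0 here
	(( nodes_test[node] += surp ))
done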
00:03:16.696 15:19:01 -- setup/common.sh@31 -- # IFS=': '
00:03:16.696 15:19:01 -- setup/common.sh@31 -- # read -r var val _
00:03:16.696 [xtrace condensed: the setup/common.sh@31-32 read/compare/continue cycle walks the node1 keys until HugePages_Surp matches]
00:03:16.697 15:19:01 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:16.697 15:19:01 -- setup/common.sh@33 -- # echo 0
00:03:16.697 15:19:01 -- setup/common.sh@33 -- # return 0
00:03:16.697 15:19:01 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:16.697 15:19:01 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:16.697 15:19:01 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:16.697 15:19:01 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:16.697 15:19:01 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
00:03:16.697 node0=512 expecting 513
00:03:16.697 15:19:01 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:16.697 15:19:01 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:16.697 15:19:01 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:16.697 15:19:01 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
00:03:16.697 node1=513 expecting 512
00:03:16.697 15:19:01 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:03:16.697
00:03:16.697 real 0m3.698s
00:03:16.697 user 0m1.396s
00:03:16.697 sys 0m2.376s
00:03:16.697 15:19:01 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:16.697 15:19:01 -- common/autotest_common.sh@10 -- # set +x
00:03:16.697 ************************************
00:03:16.697 END TEST odd_alloc
00:03:16.697 ************************************
00:03:16.697 15:19:02 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:03:16.698 15:19:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:16.698 15:19:02 -- common/autotest_common.sh@1093 -- # xtrace_disable
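The @126-@130 comparison that just passed uses a neat bash idiom: each per-node count is used as an array subscript, and since bash enumerates indexed-array subscripts in ascending order, ${!arr[*]} yields a sorted, deduplicated view of the counts. That is why the odd 513th page may land on either node and the test still passes. A sketch with this run's values:

# Sketch of the order-insensitive check at setup/hugepages.sh@126-@130.
nodes_test=([0]=512 [1]=513)   # what the test computed per node
nodes_sys=([0]=513 [1]=512)    # what the kernel reported per node
sorted_t=() sorted_s=()
for node in "${!nodes_test[@]}"; do
	sorted_t[${nodes_test[node]}]=1   # the count becomes a subscript
	sorted_s[${nodes_sys[node]}]=1
	echo "node$node=${nodes_test[node]} expecting ${nodes_sys[node]}"
done
# Both subscript lists expand to "512 513" regardless of which node held
# the extra page, so this is effectively a sorted-multiset comparison.
[[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]] && echo OK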
00:03:16.698 15:19:02 -- common/autotest_common.sh@10 -- # set +x
00:03:16.698 ************************************
00:03:16.698 START TEST custom_alloc
00:03:16.698 ************************************
00:03:16.698 15:19:02 -- common/autotest_common.sh@1114 -- # custom_alloc
00:03:16.698 15:19:02 -- setup/hugepages.sh@167 -- # local IFS=,
00:03:16.698 15:19:02 -- setup/hugepages.sh@169 -- # local node
00:03:16.698 15:19:02 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:03:16.698 15:19:02 -- setup/hugepages.sh@170 -- # local nodes_hp
00:03:16.698 15:19:02 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:03:16.698 15:19:02 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:03:16.698 15:19:02 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:16.698 15:19:02 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:16.698 15:19:02 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:16.698 15:19:02 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:16.698 15:19:02 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:16.698 15:19:02 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:16.698 15:19:02 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:16.698 15:19:02 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:16.698 15:19:02 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:16.698 15:19:02 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:16.698 15:19:02 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:16.698 15:19:02 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:16.698 15:19:02 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:16.698 15:19:02 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:16.698 15:19:02 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:16.698 15:19:02 -- setup/hugepages.sh@83 -- # : 256
00:03:16.698 15:19:02 -- setup/hugepages.sh@84 -- # : 1
00:03:16.698 15:19:02 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:16.698 15:19:02 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:16.698 15:19:02 -- setup/hugepages.sh@83 -- # : 0
00:03:16.698 15:19:02 -- setup/hugepages.sh@84 -- # : 0
00:03:16.698 15:19:02 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:16.698 15:19:02 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
00:03:16.698 15:19:02 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
00:03:16.698 15:19:02 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:03:16.698 15:19:02 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:16.698 15:19:02 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:16.698 15:19:02 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:16.698 15:19:02 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:16.698 15:19:02 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:16.698 15:19:02 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:16.698 15:19:02 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:16.698 15:19:02 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:16.698 15:19:02 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:16.698 15:19:02 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:16.698 15:19:02 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:16.698 15:19:02 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:16.698 15:19:02 -- setup/hugepages.sh@74 -- # (( 1 > 0 ))
00:03:16.698 15:19:02 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:16.698 15:19:02 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:03:16.698 15:19:02 -- setup/hugepages.sh@78 -- # return 0
00:03:16.698 15:19:02 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:03:16.698 15:19:02 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:03:16.698 15:19:02 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:03:16.698 15:19:02 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:03:16.698 15:19:02 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:03:16.698 15:19:02 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:03:16.698 15:19:02 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:03:16.698 15:19:02 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node
00:03:16.698 15:19:02 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:16.698 15:19:02 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:16.698 15:19:02 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:16.698 15:19:02 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:16.698 15:19:02 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:16.698 15:19:02 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:16.698 15:19:02 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:16.698 15:19:02 -- setup/hugepages.sh@74 -- # (( 2 > 0 ))
00:03:16.698 15:19:02 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:16.698 15:19:02 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:03:16.698 15:19:02 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:16.698 15:19:02 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024
00:03:16.698 15:19:02 -- setup/hugepages.sh@78 -- # return 0
00:03:16.698 15:19:02 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:03:16.698 15:19:02 -- setup/hugepages.sh@187 -- # setup output
00:03:16.698 15:19:02 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:16.698 15:19:02 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:19.995 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:19.995 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:19.995 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:19.995 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:19.995 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:19.995 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:19.995 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:19.995 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:19.995 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:19.995 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:19.995 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:19.995 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:19.995 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:19.995 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:19.995 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:19.995 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:19.995 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:19.995 15:19:05 -- setup/hugepages.sh@188 -- # nr_hugepages=1536
00:03:19.995 15:19:05 -- setup/hugepages.sh@188 -- # verify_nr_hugepages
00:03:19.995 15:19:05 -- setup/hugepages.sh@89 -- # local node
00:03:19.995 15:19:05 -- setup/hugepages.sh@90 -- # local sorted_t
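The custom_alloc trace above shows how the HUGENODE spec handed to setup.sh is assembled: get_test_nr_hugepages converts a size in kB to a 2048 kB page count (1048576 / 2048 = 512, 2097152 / 2048 = 1024), and the @181-@187 loop emits one nodes_hp[N]=count clause per node, joined with commas because custom_alloc declares local IFS=, at @167. A sketch:

# Sketch of the HUGENODE assembly traced at setup/hugepages.sh@181-@187.
IFS=,                          # with IFS=, the [*] expansion joins on ","
nodes_hp=([0]=512 [1]=1024)    # 1048576 kB and 2097152 kB at 2048 kB/page
HUGENODE=()
_nr_hugepages=0
for node in "${!nodes_hp[@]}"; do
	HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
	(( _nr_hugepages += nodes_hp[node] ))
done
echo "HUGENODE=${HUGENODE[*]}"      # HUGENODE=nodes_hp[0]=512,nodes_hp[1]=1024
echo "nr_hugepages=$_nr_hugepages"  # nr_hugepages=1536

That 1536 is exactly the nr_hugepages value the trace reports right after setup.sh returns.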
00:03:19.995 15:19:05 -- setup/hugepages.sh@188 -- # nr_hugepages=1536
00:03:19.995 15:19:05 -- setup/hugepages.sh@188 -- # verify_nr_hugepages
00:03:19.995 15:19:05 -- setup/hugepages.sh@89 -- # local node
00:03:19.995 15:19:05 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:19.995 15:19:05 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:19.995 15:19:05 -- setup/hugepages.sh@92 -- # local surp
00:03:19.995 15:19:05 -- setup/hugepages.sh@93 -- # local resv
00:03:19.995 15:19:05 -- setup/hugepages.sh@94 -- # local anon
00:03:19.995 15:19:05 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:19.995 15:19:05 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:19.995 15:19:05 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:19.995 15:19:05 -- setup/common.sh@18 -- # local node=
00:03:19.995 15:19:05 -- setup/common.sh@19 -- # local var val
00:03:19.995 15:19:05 -- setup/common.sh@20 -- # local mem_f mem
00:03:19.995 15:19:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:19.995 15:19:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:19.995 15:19:05 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:19.995 15:19:05 -- setup/common.sh@28 -- # mapfile -t mem
00:03:19.995 15:19:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:19.995 15:19:05 -- setup/common.sh@31 -- # IFS=': '
00:03:19.995 15:19:05 -- setup/common.sh@31 -- # read -r var val _
00:03:19.995 15:19:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 42013504 kB' 'MemAvailable: 45734300 kB' 'Buffers: 8940 kB' 'Cached: 11211844 kB' 'SwapCached: 0 kB' 'Active: 8004212 kB' 'Inactive: 3688224 kB' 'Active(anon): 7586832 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 474412 kB' 'Mapped: 172284 kB' 'Shmem: 7115180 kB' 'KReclaimable: 224568 kB' 'Slab: 921464 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696896 kB' 'KernelStack: 21760 kB' 'PageTables: 7576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957620 kB' 'Committed_AS: 8791296 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
[key scan condensed: setup/common.sh@31/@32 read each field of the snapshot above in turn and 'continue' past every key that is not AnonHugePages]
00:03:19.995 15:19:05 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:19.995 15:19:05 -- setup/common.sh@33 -- # echo 0
00:03:19.995 15:19:05 -- setup/common.sh@33 -- # return 0
00:03:19.995 15:19:05 -- setup/hugepages.sh@97 -- # anon=0
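Every get_meminfo call traced in this log has the same shape: pick /proc/meminfo (or a NUMA-local meminfo file when a node is given), mapfile it, strip any "Node N " prefix, then split each line on ': ' and echo the value of the first matching field. The sketch below reconstructs that pattern from the traced setup/common.sh lines; it is an approximation for illustration, not the SPDK file itself.

#!/usr/bin/env bash
# Approximation of the parser the trace steps through. extglob is needed
# for the +([0-9]) pattern that strips per-node "Node N " prefixes.
shopt -s extglob

get_meminfo() {                # get_meminfo <Field> [numa_node]
    local get=$1 node=${2:-}
    local var val _
    local mem_f mem

    mem_f=/proc/meminfo
    # The trace takes the /proc/meminfo branch: $node is empty, so the
    # /sys/devices/system/node/node/meminfo existence test fails.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi

    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")    # no-op for /proc/meminfo

    # Split "HugePages_Total:    1536" into var=HugePages_Total, val=1536
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < <(printf '%s\n' "${mem[@]}")
    return 1
}

get_meminfo HugePages_Total    # prints 1536 on the box traced above

Because IFS=': ' leaves the kB suffix in the third field, only the bare number is echoed, which is why the trace records plain 'echo 0' and 'echo 1536' rather than kB-suffixed values.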
00:03:19.995 15:19:05 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:19.995 15:19:05 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:19.995 15:19:05 -- setup/common.sh@18 -- # local node=
00:03:19.996 15:19:05 -- setup/common.sh@19 -- # local var val
00:03:19.996 15:19:05 -- setup/common.sh@20 -- # local mem_f mem
00:03:19.996 15:19:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:19.996 15:19:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:19.996 15:19:05 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:19.996 15:19:05 -- setup/common.sh@28 -- # mapfile -t mem
00:03:19.996 15:19:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:19.996 15:19:05 -- setup/common.sh@31 -- # IFS=': '
00:03:19.996 15:19:05 -- setup/common.sh@31 -- # read -r var val _
00:03:19.997 15:19:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 42013412 kB' 'MemAvailable: 45734208 kB' 'Buffers: 8940 kB' 'Cached: 11211848 kB' 'SwapCached: 0 kB' 'Active: 8003452 kB' 'Inactive: 3688224 kB' 'Active(anon): 7586072 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 474152 kB' 'Mapped: 172196 kB' 'Shmem: 7115184 kB' 'KReclaimable: 224568 kB' 'Slab: 921464 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696896 kB' 'KernelStack: 21760 kB' 'PageTables: 7552 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957620 kB' 'Committed_AS: 8791308 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
[key scan condensed: setup/common.sh@31/@32 read each field of the snapshot above in turn and 'continue' past every key that is not HugePages_Surp]
00:03:19.998 15:19:05 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:19.998 15:19:05 -- setup/common.sh@33 -- # echo 0
00:03:19.998 15:19:05 -- setup/common.sh@33 -- # return 0
00:03:19.998 15:19:05 -- setup/hugepages.sh@99 -- # surp=0
00:03:19.998 15:19:05 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:19.998 15:19:05 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:19.998 15:19:05 -- setup/common.sh@18 -- # local node=
00:03:19.998 15:19:05 -- setup/common.sh@19 -- # local var val
00:03:19.998 15:19:05 -- setup/common.sh@20 -- # local mem_f mem
00:03:19.998 15:19:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:19.998 15:19:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:19.998 15:19:05 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:19.998 15:19:05 -- setup/common.sh@28 -- # mapfile -t mem
00:03:19.998 15:19:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:19.998 15:19:05 -- setup/common.sh@31 -- # IFS=': '
00:03:19.998 15:19:05 -- setup/common.sh@31 -- # read -r var val _
00:03:19.998 15:19:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 42013452 kB' 'MemAvailable: 45734248 kB' 'Buffers: 8940 kB' 'Cached: 11211848 kB' 'SwapCached: 0 kB' 'Active: 8003488 kB' 'Inactive: 3688224 kB' 'Active(anon): 7586108 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 474188 kB' 'Mapped: 172196 kB' 'Shmem: 7115184 kB' 'KReclaimable: 224568 kB' 'Slab: 921464 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696896 kB' 'KernelStack: 21776 kB' 'PageTables: 7604 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957620 kB' 'Committed_AS: 8791324 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214272 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
[key scan condensed: setup/common.sh@31/@32 read each field of the snapshot above in turn and 'continue' past every key that is not HugePages_Rsvd]
00:03:19.999 15:19:05 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:19.999 15:19:05 -- setup/common.sh@33 -- # echo 0
00:03:19.999 15:19:05 -- setup/common.sh@33 -- # return 0
00:03:19.999 15:19:05 -- setup/hugepages.sh@100 -- # resv=0
00:03:19.999 15:19:05 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
00:03:19.999 nr_hugepages=1536
00:03:19.999 15:19:05 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:19.999 resv_hugepages=0
00:03:19.999 15:19:05 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:19.999 surplus_hugepages=0
00:03:19.999 15:19:05 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:19.999 anon_hugepages=0
00:03:19.999 15:19:05 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv ))
00:03:19.999 15:19:05 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
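With anon, surp, and resv collected, and HugePages_Total about to be re-read below, the verification itself is plain arithmetic: the pool built earlier (512 + 1024 pages) must match the kernel's count exactly, with no surplus or reserved pages outstanding. A compact sketch of that accounting, reconstructed from the trace (the variable 'expected' is illustrative; the traced script hard-codes 1536 into the checks):

# Sketch of the verify_nr_hugepages accounting seen above.
expected=1536        # nodes_hp[0] + nodes_hp[1] = 512 + 1024
nr_hugepages=1536    # get_meminfo HugePages_Total
surp=0               # get_meminfo HugePages_Surp
resv=0               # get_meminfo HugePages_Rsvd

((expected == nr_hugepages + surp + resv)) || echo "FAIL: surplus/reserved pages outstanding"
((expected == nr_hugepages)) || echo "FAIL: pool size mismatch"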
00:03:19.999 15:19:05 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:19.999 15:19:05 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:19.999 15:19:05 -- setup/common.sh@18 -- # local node=
00:03:19.999 15:19:05 -- setup/common.sh@19 -- # local var val
00:03:19.999 15:19:05 -- setup/common.sh@20 -- # local mem_f mem
00:03:19.999 15:19:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:19.999 15:19:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:19.999 15:19:05 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:19.999 15:19:05 -- setup/common.sh@28 -- # mapfile -t mem
00:03:19.999 15:19:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:19.999 15:19:05 -- setup/common.sh@31 -- # IFS=': '
00:03:19.999 15:19:05 -- setup/common.sh@31 -- # read -r var val _
00:03:20.000 15:19:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 42014060 kB' 'MemAvailable: 45734856 kB' 'Buffers: 8940 kB' 'Cached: 11211876 kB' 'SwapCached: 0 kB' 'Active: 8004848 kB' 'Inactive: 3688224 kB' 'Active(anon): 7587468 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 475548 kB' 'Mapped: 172696 kB' 'Shmem: 7115212 kB' 'KReclaimable: 224568 kB' 'Slab: 921464 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696896 kB' 'KernelStack: 21744 kB' 'PageTables: 7500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957620 kB' 'Committed_AS: 8793884 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214256 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
[key scan condensed: setup/common.sh@31/@32 read each field of the snapshot above in turn and 'continue' past every key that is not HugePages_Total]
00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:20.001 15:19:05 -- setup/common.sh@33 -- # echo 1536
00:03:20.001 15:19:05 -- setup/common.sh@33 -- # return 0
00:03:20.001 15:19:05 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages
+ surp + resv )) 00:03:20.001 15:19:05 -- setup/hugepages.sh@112 -- # get_nodes 00:03:20.001 15:19:05 -- setup/hugepages.sh@27 -- # local node 00:03:20.001 15:19:05 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:20.001 15:19:05 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:20.001 15:19:05 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:20.001 15:19:05 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:20.001 15:19:05 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:20.001 15:19:05 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:20.001 15:19:05 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:20.001 15:19:05 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:20.001 15:19:05 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:20.001 15:19:05 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:20.001 15:19:05 -- setup/common.sh@18 -- # local node=0 00:03:20.001 15:19:05 -- setup/common.sh@19 -- # local var val 00:03:20.001 15:19:05 -- setup/common.sh@20 -- # local mem_f mem 00:03:20.001 15:19:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:20.001 15:19:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:20.001 15:19:05 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:20.001 15:19:05 -- setup/common.sh@28 -- # mapfile -t mem 00:03:20.001 15:19:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.001 15:19:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 27414392 kB' 'MemUsed: 5220044 kB' 'SwapCached: 0 kB' 'Active: 3021176 kB' 'Inactive: 134724 kB' 'Active(anon): 2684796 kB' 'Inactive(anon): 0 kB' 'Active(file): 336380 kB' 'Inactive(file): 134724 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2999436 kB' 'Mapped: 145776 kB' 'AnonPages: 159580 kB' 'Shmem: 2528332 kB' 'KernelStack: 12296 kB' 'PageTables: 4152 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122404 kB' 'Slab: 458348 kB' 'SReclaimable: 122404 kB' 'SUnreclaim: 335944 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # continue 
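(The @110 check just above confirmed that the 1536 pages reported by HugePages_Total match nr_hugepages + surp + resv, and get_nodes at @27-@33 recorded the per-node split from /sys/devices/system/node/node*: 512 pages on node0 and 1024 on node1. The arithmetic this run has to satisfy, with surp and resv both 0 here per the HugePages_Surp scans that follow:)

    nr_hugepages=1536 surp=0 resv=0
    (( 1536 == nr_hugepages + surp + resv )) && echo ok   # 1536 == 1536
    nodes_sys=([0]=512 [1]=1024)                          # 512 + 1024 = 1536
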
00:03:20.001 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.001 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.001 15:19:05 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@33 -- # echo 0 00:03:20.002 15:19:05 -- setup/common.sh@33 -- # return 0 00:03:20.002 15:19:05 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:20.002 15:19:05 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:20.002 15:19:05 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:20.002 15:19:05 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:20.002 15:19:05 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:20.002 15:19:05 -- setup/common.sh@18 -- # local node=1 00:03:20.002 15:19:05 -- setup/common.sh@19 -- # local var val 00:03:20.002 15:19:05 -- setup/common.sh@20 -- # local mem_f mem 00:03:20.002 15:19:05 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:20.002 15:19:05 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:20.002 15:19:05 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:20.002 15:19:05 -- setup/common.sh@28 -- # mapfile -t mem 00:03:20.002 15:19:05 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649328 kB' 'MemFree: 14597132 kB' 'MemUsed: 13052196 kB' 'SwapCached: 0 kB' 'Active: 4982520 kB' 
'Inactive: 3553500 kB' 'Active(anon): 4901520 kB' 'Inactive(anon): 0 kB' 'Active(file): 81000 kB' 'Inactive(file): 3553500 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8221404 kB' 'Mapped: 26828 kB' 'AnonPages: 314780 kB' 'Shmem: 4586904 kB' 'KernelStack: 9448 kB' 'PageTables: 3364 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 102164 kB' 'Slab: 463100 kB' 'SReclaimable: 102164 kB' 'SUnreclaim: 360936 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.002 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # 
[[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.002 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 
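(The node-1 pass above re-pointed mem_f at /sys/devices/system/node/node1/meminfo. Those per-node files prefix every record with "Node 1 ", which is why the @29 line strips the prefix with an extglob substitution before the scan starts. A minimal demo of that strip; the path exists only on multi-node machines like this two-socket runner:)

    shopt -s extglob                 # needed for the +([0-9]) pattern
    mapfile -t mem < /sys/devices/system/node/node1/meminfo
    mem=("${mem[@]#Node +([0-9]) }") # "Node 1 MemTotal: ..." -> "MemTotal: ..."
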
15:19:05 -- setup/common.sh@32 -- # continue 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # IFS=': ' 00:03:20.003 15:19:05 -- setup/common.sh@31 -- # read -r var val _ 00:03:20.003 15:19:05 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:20.003 15:19:05 -- setup/common.sh@33 -- # echo 0 00:03:20.003 15:19:05 -- setup/common.sh@33 -- # return 0 00:03:20.003 15:19:05 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:20.003 15:19:05 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:20.003 15:19:05 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:20.003 15:19:05 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:20.003 15:19:05 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:20.003 node0=512 expecting 512 00:03:20.003 15:19:05 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:20.003 15:19:05 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:20.003 15:19:05 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:20.003 15:19:05 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:03:20.003 node1=1024 expecting 1024 00:03:20.003 15:19:05 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:03:20.003 00:03:20.003 real 0m3.471s 00:03:20.003 user 0m1.308s 00:03:20.003 sys 0m2.201s 00:03:20.003 15:19:05 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:20.003 15:19:05 -- common/autotest_common.sh@10 -- # set +x 00:03:20.003 ************************************ 00:03:20.003 END TEST custom_alloc 00:03:20.003 ************************************ 00:03:20.003 15:19:05 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:03:20.003 15:19:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:20.003 15:19:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:20.003 15:19:05 -- common/autotest_common.sh@10 -- # set +x 00:03:20.003 ************************************ 00:03:20.003 START TEST no_shrink_alloc 00:03:20.003 ************************************ 00:03:20.003 15:19:05 -- common/autotest_common.sh@1114 -- # no_shrink_alloc 00:03:20.003 15:19:05 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:03:20.003 15:19:05 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:20.003 15:19:05 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:20.003 15:19:05 -- setup/hugepages.sh@51 -- # shift 00:03:20.003 15:19:05 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:20.003 15:19:05 -- setup/hugepages.sh@52 -- # local node_ids 00:03:20.003 15:19:05 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:20.003 15:19:05 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:20.003 15:19:05 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:20.003 15:19:05 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:20.003 15:19:05 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:20.003 15:19:05 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:20.003 15:19:05 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:20.003 15:19:05 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:20.003 15:19:05 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:20.003 15:19:05 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:20.003 15:19:05 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:20.003 15:19:05 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:20.003 15:19:05 -- setup/hugepages.sh@73 -- # 
return 0 00:03:20.003 15:19:05 -- setup/hugepages.sh@198 -- # setup output 00:03:20.003 15:19:05 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:20.003 15:19:05 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:23.300 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:23.300 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:23.300 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:23.301 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:23.301 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:23.301 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:23.301 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:23.301 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:23.301 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:23.301 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:23.301 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:23.301 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:23.301 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:23.301 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:23.301 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:23.301 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:23.301 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:23.301 15:19:08 -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:03:23.301 15:19:08 -- setup/hugepages.sh@89 -- # local node 00:03:23.301 15:19:08 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:23.301 15:19:08 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:23.301 15:19:08 -- setup/hugepages.sh@92 -- # local surp 00:03:23.301 15:19:08 -- setup/hugepages.sh@93 -- # local resv 00:03:23.301 15:19:08 -- setup/hugepages.sh@94 -- # local anon 00:03:23.301 15:19:08 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:23.301 15:19:08 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:23.301 15:19:08 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:23.301 15:19:08 -- setup/common.sh@18 -- # local node= 00:03:23.301 15:19:08 -- setup/common.sh@19 -- # local var val 00:03:23.301 15:19:08 -- setup/common.sh@20 -- # local mem_f mem 00:03:23.301 15:19:08 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:23.301 15:19:08 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:23.301 15:19:08 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:23.301 15:19:08 -- setup/common.sh@28 -- # mapfile -t mem 00:03:23.301 15:19:08 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 43052780 kB' 'MemAvailable: 46773576 kB' 'Buffers: 8940 kB' 'Cached: 11211964 kB' 'SwapCached: 0 kB' 'Active: 8003960 kB' 'Inactive: 3688224 kB' 'Active(anon): 7586580 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 473936 kB' 'Mapped: 172304 kB' 'Shmem: 7115300 kB' 'KReclaimable: 224568 kB' 'Slab: 920788 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696220 kB' 
'KernelStack: 21728 kB' 'PageTables: 7408 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8791440 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB' 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
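(The no_shrink_alloc test that started above asked get_test_nr_hugepages for 2097152 kB pinned to node 0, and the @96 guard runs this AnonHugePages scan only because transparent hugepages are not set to [never] on this box. With the 2048 kB Hugepagesize reported in the dump above, the request presumably works out to the nr_hugepages=1024 seen in the trace:)

    # 2 GiB request / 2048 kB hugepages = 1024 pages, all assigned to node 0
    size_kb=2097152 hugepagesize_kb=2048
    echo $(( size_kb / hugepagesize_kb ))   # 1024
    nodes_test=([0]=1024)                   # node_ids=('0') in the trace
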
00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.301 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.301 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- 
# [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:23.302 15:19:08 -- setup/common.sh@33 -- # echo 0 00:03:23.302 15:19:08 -- setup/common.sh@33 -- # return 0 00:03:23.302 15:19:08 -- setup/hugepages.sh@97 -- # anon=0 00:03:23.302 15:19:08 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:23.302 15:19:08 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:23.302 15:19:08 -- setup/common.sh@18 -- # local node= 00:03:23.302 15:19:08 -- setup/common.sh@19 -- # local var val 00:03:23.302 15:19:08 -- setup/common.sh@20 -- # local mem_f mem 00:03:23.302 15:19:08 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:23.302 15:19:08 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:23.302 15:19:08 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:23.302 15:19:08 -- setup/common.sh@28 -- # mapfile -t mem 00:03:23.302 15:19:08 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 43053324 kB' 'MemAvailable: 46774120 kB' 'Buffers: 8940 kB' 'Cached: 11211976 kB' 'SwapCached: 0 kB' 'Active: 8004580 kB' 'Inactive: 3688224 kB' 'Active(anon): 7587200 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 475100 kB' 'Mapped: 172200 kB' 'Shmem: 7115312 kB' 'KReclaimable: 224568 kB' 'Slab: 920676 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696108 kB' 'KernelStack: 21776 kB' 'PageTables: 7624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8791744 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 
'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB' 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.302 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.302 15:19:08 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- 
setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ 
SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 
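(By this point verify_nr_hugepages has set anon=0 and the @99 scan in progress targets HugePages_Surp, which the whole-system dump above already reports as 0; the @100 pass that follows fetches HugePages_Rsvd, also 0 here. So, as in the earlier @110 check for custom_alloc, the verification reduces to the HugePages_Total: 1024 reported above:)

    # values the scans return on this runner, from the printf dump above
    surp=0 resv=0
    (( 1024 == 1024 + surp + resv ))   # the shape of the @110-style check
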
00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.303 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.303 15:19:08 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:23.303 15:19:08 -- setup/common.sh@33 -- # echo 0 00:03:23.303 15:19:08 -- setup/common.sh@33 -- # return 0 00:03:23.303 15:19:08 -- setup/hugepages.sh@99 -- # surp=0 00:03:23.304 15:19:08 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:23.304 15:19:08 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:23.304 15:19:08 -- setup/common.sh@18 -- # local node= 00:03:23.304 15:19:08 -- setup/common.sh@19 -- # local var val 00:03:23.304 15:19:08 -- setup/common.sh@20 -- # local mem_f mem 00:03:23.304 15:19:08 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:23.304 15:19:08 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:23.304 15:19:08 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:23.304 15:19:08 -- setup/common.sh@28 -- # mapfile -t mem 00:03:23.304 15:19:08 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 43053576 kB' 'MemAvailable: 46774372 kB' 'Buffers: 8940 kB' 'Cached: 11211984 kB' 'SwapCached: 0 kB' 'Active: 8004236 kB' 'Inactive: 3688224 kB' 'Active(anon): 7586856 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 474768 kB' 'Mapped: 172200 kB' 'Shmem: 7115320 kB' 'KReclaimable: 224568 kB' 'Slab: 920676 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696108 kB' 'KernelStack: 21776 kB' 'PageTables: 7600 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8791964 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB' 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ 
Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 
00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.304 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.304 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- 
setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # continue 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': ' 00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _ 00:03:23.305 15:19:08 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:23.305 15:19:08 -- setup/common.sh@33 -- # echo 0 00:03:23.305 15:19:08 -- setup/common.sh@33 -- # return 0 00:03:23.305 15:19:08 -- setup/hugepages.sh@100 -- # resv=0 00:03:23.305 15:19:08 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:23.305 nr_hugepages=1024 00:03:23.305 15:19:08 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:23.305 resv_hugepages=0 00:03:23.305 15:19:08 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:23.305 surplus_hugepages=0 00:03:23.305 15:19:08 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:23.305 anon_hugepages=0 00:03:23.305 15:19:08 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:23.305 15:19:08 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:23.305 15:19:08 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:23.305 15:19:08 -- setup/common.sh@17 -- # local 
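The lookups traced above all follow the same shape: read /proc/meminfo (or a node's sysfs copy) with mapfile, strip any "Node N " prefix, then scan line by line under IFS=': ' until the requested field matches. A minimal standalone sketch of that lookup, assuming a hypothetical helper name rather than the exact setup/common.sh implementation:

    #!/usr/bin/env bash
    shopt -s extglob   # needed for the "Node +([0-9]) " prefix strip below
    # get_meminfo_sketch is an illustrative re-creation of the lookup traced
    # above, not the function shipped in setup/common.sh.
    get_meminfo_sketch() {
        local get=$1 node=$2 mem_f=/proc/meminfo
        local line var val _ mem
        # Per-node statistics come from sysfs when a node index is given.
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # node files prefix lines with "Node N "
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }
    get_meminfo_sketch HugePages_Surp   # prints 0 on the host traced here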
00:03:23.305 15:19:08 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:23.305 15:19:08 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:23.305 15:19:08 -- setup/common.sh@18 -- # local node=
00:03:23.305 15:19:08 -- setup/common.sh@19 -- # local var val
00:03:23.305 15:19:08 -- setup/common.sh@20 -- # local mem_f mem
00:03:23.305 15:19:08 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:23.305 15:19:08 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:23.305 15:19:08 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:23.305 15:19:08 -- setup/common.sh@28 -- # mapfile -t mem
00:03:23.305 15:19:08 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:23.305 15:19:08 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 43053664 kB' 'MemAvailable: 46774460 kB' 'Buffers: 8940 kB' 'Cached: 11212012 kB' 'SwapCached: 0 kB' 'Active: 8003888 kB' 'Inactive: 3688224 kB' 'Active(anon): 7586508 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 474344 kB' 'Mapped: 172200 kB' 'Shmem: 7115348 kB' 'KReclaimable: 224568 kB' 'Slab: 920676 kB' 'SReclaimable: 224568 kB' 'SUnreclaim: 696108 kB' 'KernelStack: 21760 kB' 'PageTables: 7544 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8791980 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
00:03:23.305 15:19:08 -- setup/common.sh@31 -- # IFS=': '
00:03:23.305 15:19:08 -- setup/common.sh@31 -- # read -r var val _
[xtrace condensed: the per-field cycle repeats, matched against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l]
00:03:23.307 15:19:08 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:23.307 15:19:08 -- setup/common.sh@33 -- # echo 1024
00:03:23.307 15:19:08 -- setup/common.sh@33 -- # return 0
00:03:23.307 15:19:08 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
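With surp and resv read out and nr_hugepages echoed, the (( ... )) test just above asserts that the kernel's HugePages_Total equals the requested pages plus surplus and reserved. A hedged sketch of that arithmetic, reusing the get_meminfo_sketch helper from earlier (the variable names are assumptions; the values in comments are from this run):

    # Consistency check in the spirit of hugepages.sh@107-110.
    nr_hugepages=1024
    surp=$(get_meminfo_sketch HugePages_Surp)    # 0 in this run
    resv=$(get_meminfo_sketch HugePages_Rsvd)    # 0 in this run
    total=$(get_meminfo_sketch HugePages_Total)  # 1024 in this run
    (( total == nr_hugepages + surp + resv )) ||
        echo "hugepage accounting mismatch: $total != $nr_hugepages + $surp + $resv" >&2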
00:03:23.307 15:19:08 -- setup/hugepages.sh@112 -- # get_nodes
00:03:23.307 15:19:08 -- setup/hugepages.sh@27 -- # local node
00:03:23.307 15:19:08 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:23.307 15:19:08 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:23.307 15:19:08 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:23.307 15:19:08 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:23.307 15:19:08 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:23.307 15:19:08 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:23.307 15:19:08 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:23.307 15:19:08 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:23.307 15:19:08 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:23.307 15:19:08 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:23.307 15:19:08 -- setup/common.sh@18 -- # local node=0
00:03:23.307 15:19:08 -- setup/common.sh@19 -- # local var val
00:03:23.307 15:19:08 -- setup/common.sh@20 -- # local mem_f mem
00:03:23.307 15:19:08 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:23.307 15:19:08 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:23.307 15:19:08 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:23.307 15:19:08 -- setup/common.sh@28 -- # mapfile -t mem
00:03:23.307 15:19:08 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:23.307 15:19:08 -- setup/common.sh@31 -- # IFS=': '
00:03:23.307 15:19:08 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 26366128 kB' 'MemUsed: 6268308 kB' 'SwapCached: 0 kB' 'Active: 3020816 kB' 'Inactive: 134724 kB' 'Active(anon): 2684436 kB' 'Inactive(anon): 0 kB' 'Active(file): 336380 kB' 'Inactive(file): 134724 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2999436 kB' 'Mapped: 145372 kB' 'AnonPages: 159176 kB' 'Shmem: 2528332 kB' 'KernelStack: 12296 kB' 'PageTables: 4128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122404 kB' 'Slab: 457388 kB' 'SReclaimable: 122404 kB' 'SUnreclaim: 334984 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[xtrace condensed: the per-field cycle repeats over node0's meminfo, matched against \H\u\g\e\P\a\g\e\s\_\S\u\r\p]
00:03:23.308 15:19:08 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:23.308 15:19:08 -- setup/common.sh@33 -- # echo 0
00:03:23.308 15:19:08 -- setup/common.sh@33 -- # return 0
00:03:23.308 15:19:08 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:23.308 15:19:08 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:23.308 15:19:08 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:23.308 15:19:08 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:23.308 15:19:08 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:23.308 node0=1024 expecting 1024
00:03:23.308 15:19:08 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:23.308 15:19:08 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:03:23.308 15:19:08 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:03:23.308 15:19:08 -- setup/hugepages.sh@202 -- # setup output
00:03:23.308 15:19:08 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:23.308 15:19:08 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:26.604 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:26.604 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:26.604 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:26.604 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:26.604 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:26.604 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:26.604 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:26.604 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:26.604 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:26.604 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:26.604 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:26.604 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:26.604 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:26.604 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:26.604 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:26.604 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:26.604 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:26.604 INFO: Requested 512 hugepages but 1024 already allocated on node0
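The INFO line above is the expected outcome here: NRHUGE=512 is requested while node0 already holds 1024 pages and CLEAR_HUGE=no keeps the existing reservation. An illustrative check against the standard kernel sysfs knob (variable names are assumptions, not setup.sh's internals):

    # Leave an existing reservation alone when it already covers the request.
    NRHUGE=${NRHUGE:-512}
    node_huge=/sys/devices/system/node/node0/hugepages/hugepages-2048kB/nr_hugepages
    allocated=$(<"$node_huge")
    if (( allocated >= NRHUGE )); then
        echo "INFO: Requested $NRHUGE hugepages but $allocated already allocated on node0"
    else
        echo "$NRHUGE" | sudo tee "$node_huge" >/dev/null
    fi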
00:03:26.604 15:19:11 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:03:26.604 15:19:11 -- setup/hugepages.sh@89 -- # local node
00:03:26.604 15:19:11 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:26.604 15:19:11 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:26.604 15:19:11 -- setup/hugepages.sh@92 -- # local surp
00:03:26.604 15:19:11 -- setup/hugepages.sh@93 -- # local resv
00:03:26.604 15:19:11 -- setup/hugepages.sh@94 -- # local anon
00:03:26.604 15:19:11 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:26.604 15:19:11 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:26.604 15:19:11 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:26.604 15:19:11 -- setup/common.sh@18 -- # local node=
00:03:26.604 15:19:11 -- setup/common.sh@19 -- # local var val
00:03:26.604 15:19:11 -- setup/common.sh@20 -- # local mem_f mem
00:03:26.604 15:19:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:26.604 15:19:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:26.604 15:19:11 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:26.604 15:19:11 -- setup/common.sh@28 -- # mapfile -t mem
00:03:26.604 15:19:11 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:26.604 15:19:11 -- setup/common.sh@31 -- # IFS=': '
00:03:26.604 15:19:11 -- setup/common.sh@31 -- # read -r var val _
00:03:26.604 15:19:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 43057640 kB' 'MemAvailable: 46778420 kB' 'Buffers: 8940 kB' 'Cached: 11212072 kB' 'SwapCached: 0 kB' 'Active: 8004580 kB' 'Inactive: 3688224 kB' 'Active(anon): 7587200 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 474636 kB' 'Mapped: 172304 kB' 'Shmem: 7115408 kB' 'KReclaimable: 224536 kB' 'Slab: 919964 kB' 'SReclaimable: 224536 kB' 'SUnreclaim: 695428 kB' 'KernelStack: 21792 kB' 'PageTables: 7592 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8791940 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB'
[xtrace condensed: the per-field cycle repeats, matched against \A\n\o\n\H\u\g\e\P\a\g\e\s]
00:03:26.605 15:19:11 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:26.605 15:19:11 -- setup/common.sh@33 -- # echo 0
00:03:26.605 15:19:11 -- setup/common.sh@33 -- # return 0
00:03:26.605 15:19:11 -- setup/hugepages.sh@97 -- # anon=0
00:03:26.605 15:19:11 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:26.605 15:19:11 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:26.605 15:19:11 -- setup/common.sh@18 -- # local node=
00:03:26.605 15:19:11 -- setup/common.sh@19 -- # local var val
00:03:26.605 15:19:11 -- setup/common.sh@20 -- # local mem_f mem
00:03:26.605 15:19:11 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:26.605 15:19:11 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:26.605 15:19:11 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:26.605 15:19:11 -- setup/common.sh@28 -- # mapfile -t mem
00:03:26.605 15:19:11 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:26.605 15:19:11 -- setup/common.sh@31 -- # IFS=': '
00:03:26.605 15:19:11 -- setup/common.sh@31 -- # read -r var val _
00:03:26.606 15:19:11 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 43061400 kB'
'MemAvailable: 46782180 kB' 'Buffers: 8940 kB' 'Cached: 11212080 kB' 'SwapCached: 0 kB' 'Active: 8004168 kB' 'Inactive: 3688224 kB' 'Active(anon): 7586788 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 474636 kB' 'Mapped: 172204 kB' 'Shmem: 7115416 kB' 'KReclaimable: 224536 kB' 'Slab: 919972 kB' 'SReclaimable: 224536 kB' 'SUnreclaim: 695436 kB' 'KernelStack: 21728 kB' 'PageTables: 7368 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8791956 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214240 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB' 00:03:26.606 15:19:11 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:11 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:11 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:11 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:11 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:11 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:11 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:11 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:11 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:11 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:11 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:11 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:11 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:11 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:11 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:11 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 
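The trace above is setup/common.sh's get_meminfo walking a meminfo file one field at a time: mapfile reads the file into the mem array, any "Node N " prefix is stripped with an extglob pattern, and each entry is split on ': ' so the key can be compared against the requested field, echoing the value on the first match (the "echo 0 / return 0" pairs in this log). A minimal runnable sketch reconstructed from the trace follows; it folds the traced inline loop into a function and is not the verbatim SPDK source.

#!/usr/bin/env bash
# Reconstruction of the meminfo scan being traced above (per the
# setup/common.sh@17-@33 line tags). Sketch inferred from the xtrace
# output, not the verbatim SPDK script.
shopt -s extglob    # needed for the +([0-9]) patterns below

get_meminfo() {
    local get=$1 node=${2:-} var val _
    local mem_f=/proc/meminfo mem
    # With a node id, read that node's own counters from sysfs instead.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # Per-node files prefix every line with "Node N "; strip that prefix.
    mem=("${mem[@]#Node +([0-9]) }")
    local line
    for line in "${mem[@]}"; do
        # Split "Key: value kB" into key and value on ': ' / whitespace.
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done
    return 1
}

get_meminfo AnonHugePages      # system-wide scan, like the one above
get_meminfo HugePages_Surp 0   # node-local, like the later per-node calls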
00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.606 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.606 15:19:12 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ 
CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.607 15:19:12 -- setup/common.sh@33 -- # echo 0 00:03:26.607 15:19:12 -- setup/common.sh@33 -- # return 0 00:03:26.607 15:19:12 -- setup/hugepages.sh@99 -- # surp=0 00:03:26.607 15:19:12 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:26.607 15:19:12 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:26.607 15:19:12 -- setup/common.sh@18 -- # local node= 00:03:26.607 15:19:12 -- setup/common.sh@19 -- # local var val 00:03:26.607 15:19:12 -- setup/common.sh@20 -- # local mem_f mem 00:03:26.607 15:19:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:26.607 15:19:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:26.607 15:19:12 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:26.607 15:19:12 -- setup/common.sh@28 -- # mapfile -t mem 00:03:26.607 15:19:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 43062720 kB' 'MemAvailable: 46783500 kB' 'Buffers: 8940 kB' 'Cached: 11212088 kB' 'SwapCached: 0 kB' 'Active: 8003888 kB' 'Inactive: 3688224 kB' 'Active(anon): 7586508 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 474352 kB' 'Mapped: 172204 kB' 'Shmem: 7115424 kB' 'KReclaimable: 224536 kB' 'Slab: 919908 kB' 'SReclaimable: 224536 kB' 'SUnreclaim: 695372 kB' 'KernelStack: 21744 kB' 'PageTables: 7436 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8792100 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214240 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 
0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB' 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 
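Every comparison in this trace prints its right-hand side with each character backslash-escaped (\H\u\g\e\P\a\g\e\s\_\R\s\v\d and so on). That appears to be how bash xtrace renders a quoted pattern operand of [[ == ]]: the script compares against "$get" as a literal string, not a glob, and the escaping makes that visible in the log. A small demo of the two renderings, as an illustration rather than a claim about the exact SPDK source:

# Quoted RHS of [[ == ]] is matched literally; xtrace escapes it
# character by character, which is the \H\u\g\e... seen in this log.
set -x
get='HugePages_Rsvd'
[[ HugePages_Rsvd == "$get" ]] && echo literal   # traced roughly as == \H\u\g\e...
[[ HugePages_Rsvd == HugePages_* ]] && echo glob # unquoted RHS stays a pattern
set +x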
00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.607 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.607 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # 
continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- 
setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.608 15:19:12 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:26.608 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.608 15:19:12 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:26.608 15:19:12 -- setup/common.sh@33 -- # echo 0 00:03:26.608 15:19:12 -- setup/common.sh@33 -- # return 0 00:03:26.608 15:19:12 -- setup/hugepages.sh@100 -- # resv=0 00:03:26.608 15:19:12 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:26.608 nr_hugepages=1024 00:03:26.608 15:19:12 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:26.608 resv_hugepages=0 00:03:26.608 15:19:12 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:26.608 surplus_hugepages=0 00:03:26.608 15:19:12 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:26.608 anon_hugepages=0 00:03:26.608 15:19:12 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:26.608 15:19:12 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:26.609 15:19:12 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:26.609 15:19:12 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:26.609 15:19:12 -- setup/common.sh@18 -- # local node= 00:03:26.609 15:19:12 -- setup/common.sh@19 -- # local var val 00:03:26.609 15:19:12 -- setup/common.sh@20 -- # local mem_f mem 00:03:26.609 15:19:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:26.609 15:19:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:26.609 15:19:12 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:26.609 15:19:12 -- setup/common.sh@28 -- # mapfile -t mem 00:03:26.609 15:19:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283764 kB' 'MemFree: 43062516 kB' 'MemAvailable: 46783296 kB' 'Buffers: 8940 kB' 'Cached: 11212112 kB' 'SwapCached: 0 kB' 'Active: 8004196 kB' 'Inactive: 3688224 kB' 'Active(anon): 7586816 kB' 'Inactive(anon): 0 kB' 'Active(file): 417380 kB' 'Inactive(file): 3688224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 474720 kB' 'Mapped: 172204 kB' 'Shmem: 7115448 kB' 'KReclaimable: 224536 kB' 'Slab: 919908 kB' 'SReclaimable: 224536 kB' 'SUnreclaim: 695372 kB' 'KernelStack: 21760 kB' 'PageTables: 7496 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481908 kB' 'Committed_AS: 8792120 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214208 kB' 'VmallocChunk: 0 kB' 'Percpu: 74816 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 540020 kB' 'DirectMap2M: 10680320 kB' 'DirectMap1G: 58720256 kB' 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- 
setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 
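The scans in this stretch of the log feed the consistency checks at hugepages.sh@107/@109: anon, surplus, and reserved counts are captured (anon=0, surp=0, resv=0 above) and the kernel's totals are asserted against the requested nr_hugepages=1024. The same check as a standalone snippet, assuming the get_meminfo sketch shown earlier in this log:

# Mirror of the (( 1024 == nr_hugepages + surp + resv )) assertions above;
# the 1024 target mirrors the nr_hugepages=1024 reported in this log.
nr_hugepages=1024
surp=$(get_meminfo HugePages_Surp)    # 0 in the trace above
resv=$(get_meminfo HugePages_Rsvd)    # 0 in the trace above
total=$(get_meminfo HugePages_Total)  # 1024 in the trace above
(( total == nr_hugepages + surp + resv )) \
    && echo "hugepage accounting consistent ($total pages)" \
    || echo "hugepage accounting mismatch: total=$total" >&2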
00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.609 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.609 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # 
[[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:26.610 15:19:12 -- setup/common.sh@33 -- # echo 1024 00:03:26.610 15:19:12 -- setup/common.sh@33 -- # return 0 00:03:26.610 15:19:12 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:26.610 15:19:12 -- setup/hugepages.sh@112 -- # get_nodes 00:03:26.610 15:19:12 -- setup/hugepages.sh@27 -- # local node 00:03:26.610 15:19:12 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:26.610 15:19:12 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:26.610 15:19:12 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:26.610 15:19:12 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:26.610 15:19:12 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:26.610 15:19:12 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:26.610 15:19:12 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 
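The get_nodes walk traced at hugepages.sh@27-33 above enumerates NUMA nodes with the extglob pattern node+([0-9]) under /sys/devices/system/node and records a per-node hugepage count (nodes_sys[0]=1024, nodes_sys[1]=0, no_nodes=2 in this run) before the per-node get_meminfo calls that follow. A sketch of that shape; here the counts are read back from each node's meminfo rather than assigned, so the details beyond the log are illustrative:

# Enumerate NUMA nodes via sysfs and track hugepages per node, assuming
# the get_meminfo sketch from earlier in this log.
shopt -s extglob
nodes_sys=()
for node in /sys/devices/system/node/node+([0-9]); do
    id=${node##*node}                     # "node0" -> "0", as in the trace
    nodes_sys[id]=$(get_meminfo HugePages_Total "$id")
done
no_nodes=${#nodes_sys[@]}                 # the log above reports no_nodes=2
echo "nodes=$no_nodes hugepages per node: ${nodes_sys[*]}"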
00:03:26.610 15:19:12 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:26.610 15:19:12 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:26.610 15:19:12 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:26.610 15:19:12 -- setup/common.sh@18 -- # local node=0 00:03:26.610 15:19:12 -- setup/common.sh@19 -- # local var val 00:03:26.610 15:19:12 -- setup/common.sh@20 -- # local mem_f mem 00:03:26.610 15:19:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:26.610 15:19:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:26.610 15:19:12 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:26.610 15:19:12 -- setup/common.sh@28 -- # mapfile -t mem 00:03:26.610 15:19:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 26372552 kB' 'MemUsed: 6261884 kB' 'SwapCached: 0 kB' 'Active: 3020124 kB' 'Inactive: 134724 kB' 'Active(anon): 2683744 kB' 'Inactive(anon): 0 kB' 'Active(file): 336380 kB' 'Inactive(file): 134724 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2999460 kB' 'Mapped: 145376 kB' 'AnonPages: 158608 kB' 'Shmem: 2528356 kB' 'KernelStack: 12312 kB' 'PageTables: 4128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 122372 kB' 'Slab: 457152 kB' 'SReclaimable: 122372 kB' 'SUnreclaim: 334780 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.610 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.610 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # 
[[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 
-- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- 
setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # continue 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:26.611 15:19:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:26.611 15:19:12 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:26.611 15:19:12 -- setup/common.sh@33 -- # echo 0 00:03:26.611 15:19:12 -- setup/common.sh@33 -- # return 0 00:03:26.611 15:19:12 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:26.611 15:19:12 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:26.611 15:19:12 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:26.611 15:19:12 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:26.611 15:19:12 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:26.611 node0=1024 expecting 1024 00:03:26.611 15:19:12 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:26.611 00:03:26.611 real 0m6.588s 00:03:26.611 user 0m2.426s 00:03:26.611 sys 0m4.149s 00:03:26.611 15:19:12 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:26.611 15:19:12 -- common/autotest_common.sh@10 -- # set +x 00:03:26.611 ************************************ 00:03:26.611 END TEST no_shrink_alloc 00:03:26.611 ************************************ 00:03:26.611 15:19:12 -- setup/hugepages.sh@217 -- # clear_hp 00:03:26.611 15:19:12 -- setup/hugepages.sh@37 -- # local node hp 00:03:26.611 15:19:12 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:26.611 15:19:12 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:26.611 15:19:12 -- setup/hugepages.sh@41 -- # echo 0 00:03:26.611 15:19:12 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:26.611 15:19:12 -- setup/hugepages.sh@41 -- # echo 0 00:03:26.611 15:19:12 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:26.611 15:19:12 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:26.611 15:19:12 -- setup/hugepages.sh@41 -- # echo 0 00:03:26.611 15:19:12 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:26.611 15:19:12 -- setup/hugepages.sh@41 -- # echo 0 00:03:26.611 15:19:12 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:26.611 15:19:12 -- setup/hugepages.sh@45 -- # 
CLEAR_HUGE=yes 00:03:26.611 00:03:26.611 real 0m26.734s 00:03:26.611 user 0m9.380s 00:03:26.611 sys 0m16.083s 00:03:26.611 15:19:12 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:26.612 15:19:12 -- common/autotest_common.sh@10 -- # set +x 00:03:26.612 ************************************ 00:03:26.612 END TEST hugepages 00:03:26.612 ************************************ 00:03:26.612 15:19:12 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:26.612 15:19:12 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:26.612 15:19:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:26.612 15:19:12 -- common/autotest_common.sh@10 -- # set +x 00:03:26.612 ************************************ 00:03:26.612 START TEST driver 00:03:26.612 ************************************ 00:03:26.612 15:19:12 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:26.612 * Looking for test storage... 00:03:26.612 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:26.612 15:19:12 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:26.612 15:19:12 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:26.612 15:19:12 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:26.612 15:19:12 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:26.612 15:19:12 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:26.612 15:19:12 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:26.612 15:19:12 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:26.612 15:19:12 -- scripts/common.sh@335 -- # IFS=.-: 00:03:26.612 15:19:12 -- scripts/common.sh@335 -- # read -ra ver1 00:03:26.612 15:19:12 -- scripts/common.sh@336 -- # IFS=.-: 00:03:26.612 15:19:12 -- scripts/common.sh@336 -- # read -ra ver2 00:03:26.612 15:19:12 -- scripts/common.sh@337 -- # local 'op=<' 00:03:26.612 15:19:12 -- scripts/common.sh@339 -- # ver1_l=2 00:03:26.612 15:19:12 -- scripts/common.sh@340 -- # ver2_l=1 00:03:26.612 15:19:12 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:26.612 15:19:12 -- scripts/common.sh@343 -- # case "$op" in 00:03:26.612 15:19:12 -- scripts/common.sh@344 -- # : 1 00:03:26.612 15:19:12 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:26.612 15:19:12 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:26.612 15:19:12 -- scripts/common.sh@364 -- # decimal 1 00:03:26.612 15:19:12 -- scripts/common.sh@352 -- # local d=1 00:03:26.612 15:19:12 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:26.612 15:19:12 -- scripts/common.sh@354 -- # echo 1 00:03:26.612 15:19:12 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:26.612 15:19:12 -- scripts/common.sh@365 -- # decimal 2 00:03:26.612 15:19:12 -- scripts/common.sh@352 -- # local d=2 00:03:26.612 15:19:12 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:26.612 15:19:12 -- scripts/common.sh@354 -- # echo 2 00:03:26.612 15:19:12 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:26.612 15:19:12 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:26.612 15:19:12 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:26.612 15:19:12 -- scripts/common.sh@367 -- # return 0 00:03:26.612 15:19:12 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:26.612 15:19:12 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:26.612 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:26.612 --rc genhtml_branch_coverage=1 00:03:26.612 --rc genhtml_function_coverage=1 00:03:26.612 --rc genhtml_legend=1 00:03:26.612 --rc geninfo_all_blocks=1 00:03:26.612 --rc geninfo_unexecuted_blocks=1 00:03:26.612 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:26.612 ' 00:03:26.612 15:19:12 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:26.612 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:26.612 --rc genhtml_branch_coverage=1 00:03:26.612 --rc genhtml_function_coverage=1 00:03:26.612 --rc genhtml_legend=1 00:03:26.612 --rc geninfo_all_blocks=1 00:03:26.612 --rc geninfo_unexecuted_blocks=1 00:03:26.612 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:26.612 ' 00:03:26.612 15:19:12 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:26.612 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:26.612 --rc genhtml_branch_coverage=1 00:03:26.612 --rc genhtml_function_coverage=1 00:03:26.612 --rc genhtml_legend=1 00:03:26.612 --rc geninfo_all_blocks=1 00:03:26.612 --rc geninfo_unexecuted_blocks=1 00:03:26.612 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:26.612 ' 00:03:26.612 15:19:12 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:26.612 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:26.612 --rc genhtml_branch_coverage=1 00:03:26.612 --rc genhtml_function_coverage=1 00:03:26.612 --rc genhtml_legend=1 00:03:26.612 --rc geninfo_all_blocks=1 00:03:26.612 --rc geninfo_unexecuted_blocks=1 00:03:26.612 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:26.612 ' 00:03:26.612 15:19:12 -- setup/driver.sh@68 -- # setup reset 00:03:26.612 15:19:12 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:26.612 15:19:12 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:31.890 15:19:16 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:03:31.890 15:19:16 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:31.890 15:19:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:31.890 15:19:16 -- common/autotest_common.sh@10 -- # set +x 00:03:31.890 ************************************ 00:03:31.890 START TEST guess_driver 
00:03:31.890 ************************************ 00:03:31.890 15:19:16 -- common/autotest_common.sh@1114 -- # guess_driver 00:03:31.890 15:19:16 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:03:31.890 15:19:16 -- setup/driver.sh@47 -- # local fail=0 00:03:31.890 15:19:16 -- setup/driver.sh@49 -- # pick_driver 00:03:31.890 15:19:16 -- setup/driver.sh@36 -- # vfio 00:03:31.890 15:19:16 -- setup/driver.sh@21 -- # local iommu_grups 00:03:31.890 15:19:16 -- setup/driver.sh@22 -- # local unsafe_vfio 00:03:31.890 15:19:16 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:03:31.890 15:19:16 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:03:31.890 15:19:16 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:03:31.890 15:19:16 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:03:31.890 15:19:16 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:03:31.890 15:19:16 -- setup/driver.sh@14 -- # mod vfio_pci 00:03:31.890 15:19:16 -- setup/driver.sh@12 -- # dep vfio_pci 00:03:31.890 15:19:16 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:03:31.890 15:19:16 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:03:31.890 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:31.890 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:31.890 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:31.890 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:31.890 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:03:31.890 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:03:31.890 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:03:31.890 15:19:16 -- setup/driver.sh@30 -- # return 0 00:03:31.890 15:19:16 -- setup/driver.sh@37 -- # echo vfio-pci 00:03:31.890 15:19:16 -- setup/driver.sh@49 -- # driver=vfio-pci 00:03:31.890 15:19:16 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:03:31.890 15:19:16 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:03:31.890 Looking for driver=vfio-pci 00:03:31.890 15:19:16 -- setup/driver.sh@45 -- # setup output config 00:03:31.890 15:19:16 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:31.890 15:19:16 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:31.890 15:19:16 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:35.183 15:19:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:35.183 15:19:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:35.184 15:19:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:35.184 15:19:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:35.184 15:19:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:35.184 15:19:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:35.184 15:19:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:35.184 15:19:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:35.184 15:19:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:35.184 15:19:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:35.184 15:19:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:35.184 15:19:20 -- 
setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:35.184 15:19:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:35.184 15:19:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:35.184 15:19:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:35.184 15:19:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:35.184 15:19:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:35.184 15:19:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:35.184 15:19:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:35.184 15:19:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:35.184 15:19:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:35.184 15:19:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:35.184 15:19:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:35.184 15:19:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:35.184 15:19:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:35.184 15:19:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:35.184 15:19:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:35.184 15:19:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:35.184 15:19:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:35.184 15:19:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:35.184 15:19:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:35.184 15:19:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:35.184 15:19:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:35.184 15:19:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:35.184 15:19:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:35.184 15:19:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:35.184 15:19:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:35.184 15:19:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:35.184 15:19:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:35.184 15:19:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:35.184 15:19:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:35.184 15:19:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:35.184 15:19:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:35.184 15:19:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:35.184 15:19:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:35.184 15:19:20 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:35.184 15:19:20 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:35.184 15:19:20 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:36.564 15:19:22 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:36.564 15:19:22 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:36.564 15:19:22 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:36.564 15:19:22 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:03:36.564 15:19:22 -- setup/driver.sh@65 -- # setup reset 00:03:36.564 15:19:22 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:36.564 15:19:22 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:40.760 00:03:40.760 real 0m9.476s 00:03:40.760 user 0m2.326s 00:03:40.760 sys 0m4.780s 00:03:40.760 15:19:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:40.760 15:19:26 -- common/autotest_common.sh@10 
-- # set +x 00:03:40.760 ************************************ 00:03:40.760 END TEST guess_driver 00:03:40.760 ************************************ 00:03:40.760 00:03:40.760 real 0m14.186s 00:03:40.760 user 0m3.578s 00:03:40.760 sys 0m7.447s 00:03:40.760 15:19:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:40.760 15:19:26 -- common/autotest_common.sh@10 -- # set +x 00:03:40.760 ************************************ 00:03:40.760 END TEST driver 00:03:40.760 ************************************ 00:03:40.760 15:19:26 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:03:40.760 15:19:26 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:40.760 15:19:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:40.760 15:19:26 -- common/autotest_common.sh@10 -- # set +x 00:03:40.760 ************************************ 00:03:40.760 START TEST devices 00:03:40.760 ************************************ 00:03:40.760 15:19:26 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:03:40.760 * Looking for test storage... 00:03:40.760 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:40.760 15:19:26 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:40.760 15:19:26 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:40.760 15:19:26 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:41.020 15:19:26 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:41.020 15:19:26 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:41.020 15:19:26 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:41.020 15:19:26 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:41.020 15:19:26 -- scripts/common.sh@335 -- # IFS=.-: 00:03:41.020 15:19:26 -- scripts/common.sh@335 -- # read -ra ver1 00:03:41.020 15:19:26 -- scripts/common.sh@336 -- # IFS=.-: 00:03:41.020 15:19:26 -- scripts/common.sh@336 -- # read -ra ver2 00:03:41.020 15:19:26 -- scripts/common.sh@337 -- # local 'op=<' 00:03:41.020 15:19:26 -- scripts/common.sh@339 -- # ver1_l=2 00:03:41.020 15:19:26 -- scripts/common.sh@340 -- # ver2_l=1 00:03:41.020 15:19:26 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:41.020 15:19:26 -- scripts/common.sh@343 -- # case "$op" in 00:03:41.020 15:19:26 -- scripts/common.sh@344 -- # : 1 00:03:41.020 15:19:26 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:41.020 15:19:26 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:41.020 15:19:26 -- scripts/common.sh@364 -- # decimal 1 00:03:41.020 15:19:26 -- scripts/common.sh@352 -- # local d=1 00:03:41.020 15:19:26 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:41.020 15:19:26 -- scripts/common.sh@354 -- # echo 1 00:03:41.020 15:19:26 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:41.020 15:19:26 -- scripts/common.sh@365 -- # decimal 2 00:03:41.020 15:19:26 -- scripts/common.sh@352 -- # local d=2 00:03:41.020 15:19:26 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:41.020 15:19:26 -- scripts/common.sh@354 -- # echo 2 00:03:41.020 15:19:26 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:41.020 15:19:26 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:41.020 15:19:26 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:41.020 15:19:26 -- scripts/common.sh@367 -- # return 0 00:03:41.020 15:19:26 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:41.020 15:19:26 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:41.020 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:41.020 --rc genhtml_branch_coverage=1 00:03:41.020 --rc genhtml_function_coverage=1 00:03:41.020 --rc genhtml_legend=1 00:03:41.020 --rc geninfo_all_blocks=1 00:03:41.020 --rc geninfo_unexecuted_blocks=1 00:03:41.020 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:41.020 ' 00:03:41.020 15:19:26 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:41.020 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:41.020 --rc genhtml_branch_coverage=1 00:03:41.020 --rc genhtml_function_coverage=1 00:03:41.020 --rc genhtml_legend=1 00:03:41.020 --rc geninfo_all_blocks=1 00:03:41.020 --rc geninfo_unexecuted_blocks=1 00:03:41.020 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:41.021 ' 00:03:41.021 15:19:26 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:41.021 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:41.021 --rc genhtml_branch_coverage=1 00:03:41.021 --rc genhtml_function_coverage=1 00:03:41.021 --rc genhtml_legend=1 00:03:41.021 --rc geninfo_all_blocks=1 00:03:41.021 --rc geninfo_unexecuted_blocks=1 00:03:41.021 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:41.021 ' 00:03:41.021 15:19:26 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:41.021 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:41.021 --rc genhtml_branch_coverage=1 00:03:41.021 --rc genhtml_function_coverage=1 00:03:41.021 --rc genhtml_legend=1 00:03:41.021 --rc geninfo_all_blocks=1 00:03:41.021 --rc geninfo_unexecuted_blocks=1 00:03:41.021 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:41.021 ' 00:03:41.021 15:19:26 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:03:41.021 15:19:26 -- setup/devices.sh@192 -- # setup reset 00:03:41.021 15:19:26 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:41.021 15:19:26 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:44.317 15:19:30 -- setup/devices.sh@194 -- # get_zoned_devs 00:03:44.317 15:19:30 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:44.317 15:19:30 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:44.317 15:19:30 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:44.317 15:19:30 
-- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:44.317 15:19:30 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:44.317 15:19:30 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:44.317 15:19:30 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:44.317 15:19:30 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:44.317 15:19:30 -- setup/devices.sh@196 -- # blocks=() 00:03:44.317 15:19:30 -- setup/devices.sh@196 -- # declare -a blocks 00:03:44.317 15:19:30 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:03:44.317 15:19:30 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:03:44.317 15:19:30 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:03:44.317 15:19:30 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:03:44.317 15:19:30 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:03:44.317 15:19:30 -- setup/devices.sh@201 -- # ctrl=nvme0 00:03:44.317 15:19:30 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:03:44.317 15:19:30 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:44.317 15:19:30 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:03:44.317 15:19:30 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:03:44.317 15:19:30 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:03:44.317 No valid GPT data, bailing 00:03:44.317 15:19:30 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:44.317 15:19:30 -- scripts/common.sh@393 -- # pt= 00:03:44.317 15:19:30 -- scripts/common.sh@394 -- # return 1 00:03:44.317 15:19:30 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:03:44.317 15:19:30 -- setup/common.sh@76 -- # local dev=nvme0n1 00:03:44.317 15:19:30 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:03:44.317 15:19:30 -- setup/common.sh@80 -- # echo 1600321314816 00:03:44.317 15:19:30 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:03:44.317 15:19:30 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:03:44.317 15:19:30 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:03:44.317 15:19:30 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:03:44.317 15:19:30 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:03:44.317 15:19:30 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:03:44.317 15:19:30 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:44.317 15:19:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:44.317 15:19:30 -- common/autotest_common.sh@10 -- # set +x 00:03:44.317 ************************************ 00:03:44.317 START TEST nvme_mount 00:03:44.317 ************************************ 00:03:44.317 15:19:30 -- common/autotest_common.sh@1114 -- # nvme_mount 00:03:44.317 15:19:30 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:03:44.317 15:19:30 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:03:44.317 15:19:30 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:44.317 15:19:30 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:44.317 15:19:30 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:03:44.317 15:19:30 -- setup/common.sh@39 -- # local disk=nvme0n1 00:03:44.317 15:19:30 -- setup/common.sh@40 -- # local part_no=1 00:03:44.317 15:19:30 -- setup/common.sh@41 -- # 
local size=1073741824 00:03:44.317 15:19:30 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:03:44.317 15:19:30 -- setup/common.sh@44 -- # parts=() 00:03:44.317 15:19:30 -- setup/common.sh@44 -- # local parts 00:03:44.317 15:19:30 -- setup/common.sh@46 -- # (( part = 1 )) 00:03:44.317 15:19:30 -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:44.317 15:19:30 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:03:44.317 15:19:30 -- setup/common.sh@46 -- # (( part++ )) 00:03:44.317 15:19:30 -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:44.317 15:19:30 -- setup/common.sh@51 -- # (( size /= 512 )) 00:03:44.317 15:19:30 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:03:44.317 15:19:30 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:03:45.337 Creating new GPT entries in memory. 00:03:45.337 GPT data structures destroyed! You may now partition the disk using fdisk or 00:03:45.337 other utilities. 00:03:45.337 15:19:31 -- setup/common.sh@57 -- # (( part = 1 )) 00:03:45.337 15:19:31 -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:45.337 15:19:31 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:03:45.337 15:19:31 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:03:45.337 15:19:31 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:03:46.723 Creating new GPT entries in memory. 00:03:46.723 The operation has completed successfully. 00:03:46.723 15:19:32 -- setup/common.sh@57 -- # (( part++ )) 00:03:46.723 15:19:32 -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:46.723 15:19:32 -- setup/common.sh@62 -- # wait 1554833 00:03:46.723 15:19:32 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:46.723 15:19:32 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:03:46.723 15:19:32 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:46.723 15:19:32 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:03:46.723 15:19:32 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:03:46.723 15:19:32 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:46.723 15:19:32 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:46.723 15:19:32 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:03:46.723 15:19:32 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:03:46.723 15:19:32 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:46.723 15:19:32 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:46.723 15:19:32 -- setup/devices.sh@53 -- # local found=0 00:03:46.723 15:19:32 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:46.723 15:19:32 -- setup/devices.sh@56 -- # : 00:03:46.723 15:19:32 -- setup/devices.sh@59 -- # local pci status 00:03:46.723 15:19:32 -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:03:46.723 15:19:32 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:03:46.723 15:19:32 -- setup/devices.sh@47 -- # setup output config 00:03:46.723 15:19:32 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:46.723 15:19:32 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:49.257 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.257 15:19:35 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:03:49.257 15:19:35 -- setup/devices.sh@63 -- # found=1 00:03:49.257 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.257 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.257 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.257 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.257 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.518 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.518 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.518 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.518 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.518 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.518 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.518 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.518 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.518 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.518 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.518 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.518 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.518 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.518 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.518 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.518 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.518 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.518 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.518 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.518 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.518 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.518 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.518 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.518 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.518 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.518 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.518 15:19:35 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:49.518 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:49.518 15:19:35 -- 
setup/devices.sh@66 -- # (( found == 1 )) 00:03:49.518 15:19:35 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:03:49.518 15:19:35 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:49.518 15:19:35 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:49.518 15:19:35 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:49.518 15:19:35 -- setup/devices.sh@110 -- # cleanup_nvme 00:03:49.518 15:19:35 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:49.518 15:19:35 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:49.518 15:19:35 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:03:49.518 15:19:35 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:03:49.518 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:03:49.518 15:19:35 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:03:49.518 15:19:35 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:03:49.777 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:03:49.777 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:03:49.777 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:03:49.777 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:03:49.777 15:19:35 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:03:49.777 15:19:35 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:03:49.777 15:19:35 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:49.777 15:19:35 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:03:49.777 15:19:35 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:03:49.777 15:19:35 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:50.034 15:19:35 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:50.034 15:19:35 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:03:50.034 15:19:35 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:03:50.034 15:19:35 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:50.034 15:19:35 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:50.034 15:19:35 -- setup/devices.sh@53 -- # local found=0 00:03:50.034 15:19:35 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:50.034 15:19:35 -- setup/devices.sh@56 -- # : 00:03:50.034 15:19:35 -- setup/devices.sh@59 -- # local pci status 00:03:50.034 15:19:35 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:50.034 15:19:35 -- setup/devices.sh@47 -- # 
PCI_ALLOWED=0000:d8:00.0 00:03:50.034 15:19:35 -- setup/devices.sh@47 -- # setup output config 00:03:50.034 15:19:35 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:50.034 15:19:35 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:03:53.321 15:19:38 -- setup/devices.sh@63 -- # found=1 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@66 -- # (( found == 1 )) 00:03:53.321 15:19:38 -- setup/devices.sh@68 -- # [[ -n 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:03:53.321 15:19:38 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:53.321 15:19:38 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:53.321 15:19:38 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:53.321 15:19:38 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:53.321 15:19:38 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:03:53.321 15:19:38 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:03:53.321 15:19:38 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:03:53.321 15:19:38 -- setup/devices.sh@50 -- # local mount_point= 00:03:53.321 15:19:38 -- setup/devices.sh@51 -- # local test_file= 00:03:53.321 15:19:38 -- setup/devices.sh@53 -- # local found=0 00:03:53.321 15:19:38 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:03:53.321 15:19:38 -- setup/devices.sh@59 -- # local pci status 00:03:53.321 15:19:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:53.321 15:19:38 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:03:53.321 15:19:38 -- setup/devices.sh@47 -- # setup output config 00:03:53.321 15:19:38 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:53.321 15:19:38 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:03:56.604 15:19:42 -- setup/devices.sh@63 -- # found=1 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 
15:19:42 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 15:19:42 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:56.604 15:19:42 -- setup/devices.sh@66 -- # (( found == 1 )) 00:03:56.604 15:19:42 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:03:56.604 15:19:42 -- setup/devices.sh@68 -- # return 0 00:03:56.604 15:19:42 -- setup/devices.sh@128 -- # cleanup_nvme 00:03:56.604 15:19:42 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:56.604 15:19:42 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:03:56.604 15:19:42 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:03:56.604 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:03:56.604 00:03:56.604 real 0m12.169s 00:03:56.604 user 0m3.422s 00:03:56.604 sys 0m6.581s 00:03:56.604 15:19:42 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:56.604 15:19:42 -- common/autotest_common.sh@10 -- # set +x 00:03:56.604 ************************************ 00:03:56.604 END TEST nvme_mount 00:03:56.604 ************************************ 00:03:56.604 15:19:42 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:03:56.604 15:19:42 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:56.604 15:19:42 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:56.604 15:19:42 -- common/autotest_common.sh@10 -- # set +x 00:03:56.604 ************************************ 00:03:56.604 START TEST dm_mount 00:03:56.604 ************************************ 00:03:56.604 15:19:42 -- common/autotest_common.sh@1114 -- # dm_mount 00:03:56.604 15:19:42 -- setup/devices.sh@144 -- # pv=nvme0n1 00:03:56.604 15:19:42 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:03:56.604 15:19:42 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:03:56.604 15:19:42 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:03:56.604 15:19:42 -- setup/common.sh@39 -- # local disk=nvme0n1 00:03:56.604 15:19:42 -- setup/common.sh@40 -- # local part_no=2 00:03:56.604 15:19:42 -- setup/common.sh@41 -- # local size=1073741824 00:03:56.604 15:19:42 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:03:56.604 15:19:42 -- setup/common.sh@44 -- # parts=() 00:03:56.604 15:19:42 -- setup/common.sh@44 -- # local parts 00:03:56.604 15:19:42 -- setup/common.sh@46 -- # (( part = 1 )) 00:03:56.604 15:19:42 -- setup/common.sh@46 -- # (( part <= part_no )) 
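The dm_mount test starts by splitting the NVMe disk in two. As a reading aid, here is a minimal bash sketch of the partitioning arithmetic that the entries below trace out of setup/common.sh; the variable names mirror the trace, but the block itself is an illustration, not the SPDK source:

#!/usr/bin/env bash
# Sketch of the partition_drive pattern seen in the trace: wipe the GPT,
# then carve part_no equal partitions starting at sector 2048.
disk=/dev/nvme0n1
part_no=2
size=1073741824               # 1 GiB per partition, as logged
(( size /= 512 ))             # bytes -> 512-byte sectors (2097152)
sgdisk "$disk" --zap-all
part_start=0 part_end=0
for (( part = 1; part <= part_no; part++ )); do
  (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
  (( part_end = part_start + size - 1 ))
  # flock serializes concurrent sgdisk callers on the same disk
  flock "$disk" sgdisk "$disk" --new=$part:$part_start:$part_end
done
# With part_no=2 this issues --new=1:2048:2099199 and
# --new=2:2099200:4196351, the exact calls recorded below.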
00:03:56.604 15:19:42 -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:03:56.604 15:19:42 -- setup/common.sh@46 -- # (( part++ ))
00:03:56.604 15:19:42 -- setup/common.sh@46 -- # (( part <= part_no ))
00:03:56.604 15:19:42 -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:03:56.604 15:19:42 -- setup/common.sh@46 -- # (( part++ ))
00:03:56.604 15:19:42 -- setup/common.sh@46 -- # (( part <= part_no ))
00:03:56.604 15:19:42 -- setup/common.sh@51 -- # (( size /= 512 ))
00:03:56.604 15:19:42 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2
00:03:56.604 15:19:42 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all
00:03:57.542 Creating new GPT entries in memory.
00:03:57.542 GPT data structures destroyed! You may now partition the disk using fdisk or
00:03:57.542 other utilities.
00:03:57.542 15:19:43 -- setup/common.sh@57 -- # (( part = 1 ))
00:03:57.542 15:19:43 -- setup/common.sh@57 -- # (( part <= part_no ))
00:03:57.542 15:19:43 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:03:57.542 15:19:43 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:03:57.542 15:19:43 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199
00:03:58.922 Creating new GPT entries in memory.
00:03:58.922 The operation has completed successfully.
00:03:58.922 15:19:44 -- setup/common.sh@57 -- # (( part++ ))
00:03:58.922 15:19:44 -- setup/common.sh@57 -- # (( part <= part_no ))
00:03:58.922 15:19:44 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:03:58.922 15:19:44 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:03:58.922 15:19:44 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351
00:03:59.863 The operation has completed successfully.
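With both partitions in place, the test assembles them into a single device-mapper node. The trace records only "dmsetup create nvme_dm_test", not the table piped into it, so the linear table in this sketch is an assumption: one plausible way to produce the /dev/mapper/nvme_dm_test -> /dev/dm-0 node and the holders/dm-0 links that the following entries verify:

#!/usr/bin/env bash
# Hedged sketch: concatenate the two test partitions with a linear
# device-mapper table (the table the SPDK script actually feeds
# dmsetup is not visible in the trace).
p1=/dev/nvme0n1p1
p2=/dev/nvme0n1p2
s1=$(blockdev --getsz "$p1")   # partition length in 512-byte sectors
s2=$(blockdev --getsz "$p2")
dmsetup create nvme_dm_test <<EOF
0 $s1 linear $p1 0
$s1 $s2 linear $p2 0
EOF
readlink -f /dev/mapper/nvme_dm_test   # resolves to /dev/dm-0 here
ls /sys/class/block/nvme0n1p1/holders  # lists dm-0 once the map exists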
00:03:59.863 15:19:45 -- setup/common.sh@57 -- # (( part++ )) 00:03:59.863 15:19:45 -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:59.863 15:19:45 -- setup/common.sh@62 -- # wait 1559308 00:03:59.863 15:19:45 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:03:59.863 15:19:45 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:03:59.863 15:19:45 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:03:59.863 15:19:45 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:03:59.863 15:19:45 -- setup/devices.sh@160 -- # for t in {1..5} 00:03:59.863 15:19:45 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:03:59.863 15:19:45 -- setup/devices.sh@161 -- # break 00:03:59.863 15:19:45 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:03:59.863 15:19:45 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:03:59.863 15:19:45 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:03:59.863 15:19:45 -- setup/devices.sh@166 -- # dm=dm-0 00:03:59.863 15:19:45 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:03:59.863 15:19:45 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:03:59.863 15:19:45 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:03:59.863 15:19:45 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:03:59.863 15:19:45 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:03:59.863 15:19:45 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:03:59.863 15:19:45 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:03:59.863 15:19:45 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:03:59.863 15:19:45 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:03:59.863 15:19:45 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:03:59.863 15:19:45 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:03:59.863 15:19:45 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:03:59.863 15:19:45 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:03:59.863 15:19:45 -- setup/devices.sh@53 -- # local found=0 00:03:59.863 15:19:45 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:03:59.863 15:19:45 -- setup/devices.sh@56 -- # : 00:03:59.863 15:19:45 -- setup/devices.sh@59 -- # local pci status 00:03:59.863 15:19:45 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:59.863 15:19:45 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:03:59.863 15:19:45 -- setup/devices.sh@47 -- # setup output config 00:03:59.863 15:19:45 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:59.863 15:19:45 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:03.275 15:19:48 -- 
setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:03.275 15:19:48 -- setup/devices.sh@63 -- # found=1 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:03.275 15:19:48 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:03.275 15:19:48 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:03.275 15:19:48 -- setup/devices.sh@73 -- # 
[[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:03.275 15:19:48 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:03.275 15:19:48 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:03.275 15:19:48 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:03.275 15:19:48 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:03.275 15:19:48 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:03.275 15:19:48 -- setup/devices.sh@50 -- # local mount_point= 00:04:03.275 15:19:48 -- setup/devices.sh@51 -- # local test_file= 00:04:03.275 15:19:48 -- setup/devices.sh@53 -- # local found=0 00:04:03.275 15:19:48 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:03.275 15:19:48 -- setup/devices.sh@59 -- # local pci status 00:04:03.275 15:19:48 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:03.275 15:19:48 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:03.275 15:19:48 -- setup/devices.sh@47 -- # setup output config 00:04:03.275 15:19:48 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:03.275 15:19:48 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:06.568 15:19:51 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:51 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:06.568 15:19:51 -- setup/devices.sh@63 -- # found=1 00:04:06.568 15:19:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:51 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:51 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:52 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:52 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:52 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:52 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:52 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:52 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:52 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:52 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:52 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:52 -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:52 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:52 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:52 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:52 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:52 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:52 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:06.568 15:19:52 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:06.568 15:19:52 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:06.568 15:19:52 -- setup/devices.sh@68 -- # return 0 00:04:06.568 15:19:52 -- setup/devices.sh@187 -- # cleanup_dm 00:04:06.568 15:19:52 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:06.568 15:19:52 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:06.568 15:19:52 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:06.568 15:19:52 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:06.568 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:06.568 15:19:52 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:06.568 00:04:06.568 real 0m9.972s 00:04:06.568 user 0m2.498s 00:04:06.568 sys 0m4.573s 00:04:06.568 15:19:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:06.568 15:19:52 -- common/autotest_common.sh@10 -- # set +x 00:04:06.568 ************************************ 00:04:06.568 END TEST dm_mount 00:04:06.568 ************************************ 00:04:06.568 15:19:52 -- setup/devices.sh@1 -- # cleanup 00:04:06.568 15:19:52 -- setup/devices.sh@11 -- # cleanup_nvme 00:04:06.568 15:19:52 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:06.568 15:19:52 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:06.568 15:19:52 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:06.568 15:19:52 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:06.827 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:06.827 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:06.827 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:06.827 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:06.827 15:19:52 -- setup/devices.sh@12 -- # cleanup_dm 00:04:06.827 15:19:52 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:06.827 15:19:52 -- setup/devices.sh@36 -- # [[ -L 
/dev/mapper/nvme_dm_test ]] 00:04:06.827 15:19:52 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:06.827 15:19:52 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:06.827 15:19:52 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:04:06.827 15:19:52 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:04:06.827 00:04:06.827 real 0m26.161s 00:04:06.827 user 0m7.240s 00:04:06.827 sys 0m13.731s 00:04:06.827 15:19:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:06.827 15:19:52 -- common/autotest_common.sh@10 -- # set +x 00:04:06.827 ************************************ 00:04:06.827 END TEST devices 00:04:06.827 ************************************ 00:04:07.086 00:04:07.086 real 1m30.840s 00:04:07.086 user 0m27.721s 00:04:07.086 sys 0m51.745s 00:04:07.086 15:19:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:07.086 15:19:52 -- common/autotest_common.sh@10 -- # set +x 00:04:07.086 ************************************ 00:04:07.086 END TEST setup.sh 00:04:07.086 ************************************ 00:04:07.086 15:19:52 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:10.374 Hugepages 00:04:10.374 node hugesize free / total 00:04:10.374 node0 1048576kB 0 / 0 00:04:10.374 node0 2048kB 2048 / 2048 00:04:10.374 node1 1048576kB 0 / 0 00:04:10.374 node1 2048kB 0 / 0 00:04:10.374 00:04:10.374 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:10.374 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:04:10.374 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:04:10.374 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:04:10.374 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:04:10.374 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:04:10.374 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:04:10.374 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:04:10.374 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:04:10.374 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:04:10.374 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:04:10.374 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:04:10.374 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:04:10.374 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:04:10.374 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:04:10.374 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:04:10.374 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:04:10.374 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:04:10.374 15:19:56 -- spdk/autotest.sh@128 -- # uname -s 00:04:10.374 15:19:56 -- spdk/autotest.sh@128 -- # [[ Linux == Linux ]] 00:04:10.374 15:19:56 -- spdk/autotest.sh@130 -- # nvme_namespace_revert 00:04:10.374 15:19:56 -- common/autotest_common.sh@1526 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:14.563 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:14.563 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:14.563 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:14.563 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:14.563 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:14.563 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:14.563 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:14.563 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:14.563 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:14.563 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:14.563 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:14.563 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:14.563 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:14.563 
0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:14.563 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:14.563 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:15.942 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:15.942 15:20:01 -- common/autotest_common.sh@1527 -- # sleep 1 00:04:16.882 15:20:02 -- common/autotest_common.sh@1528 -- # bdfs=() 00:04:16.882 15:20:02 -- common/autotest_common.sh@1528 -- # local bdfs 00:04:16.882 15:20:02 -- common/autotest_common.sh@1529 -- # bdfs=($(get_nvme_bdfs)) 00:04:16.882 15:20:02 -- common/autotest_common.sh@1529 -- # get_nvme_bdfs 00:04:16.882 15:20:02 -- common/autotest_common.sh@1508 -- # bdfs=() 00:04:16.882 15:20:02 -- common/autotest_common.sh@1508 -- # local bdfs 00:04:16.882 15:20:02 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:16.882 15:20:02 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:16.882 15:20:02 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:04:16.882 15:20:02 -- common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:04:16.882 15:20:02 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:04:16.882 15:20:02 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:20.173 Waiting for block devices as requested 00:04:20.173 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:20.173 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:20.173 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:20.173 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:20.173 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:20.173 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:20.173 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:20.432 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:20.432 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:20.432 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:20.432 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:20.692 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:20.692 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:20.692 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:20.952 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:20.952 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:21.211 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:04:21.211 15:20:06 -- common/autotest_common.sh@1533 -- # for bdf in "${bdfs[@]}" 00:04:21.211 15:20:06 -- common/autotest_common.sh@1534 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:04:21.211 15:20:06 -- common/autotest_common.sh@1497 -- # readlink -f /sys/class/nvme/nvme0 00:04:21.211 15:20:06 -- common/autotest_common.sh@1497 -- # grep 0000:d8:00.0/nvme/nvme 00:04:21.211 15:20:06 -- common/autotest_common.sh@1497 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:04:21.211 15:20:06 -- common/autotest_common.sh@1498 -- # [[ -z /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:04:21.211 15:20:06 -- common/autotest_common.sh@1502 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:04:21.211 15:20:06 -- common/autotest_common.sh@1502 -- # printf '%s\n' nvme0 00:04:21.211 15:20:06 -- common/autotest_common.sh@1534 -- # nvme_ctrlr=/dev/nvme0 00:04:21.211 15:20:06 -- common/autotest_common.sh@1535 -- # [[ -z /dev/nvme0 ]] 00:04:21.211 15:20:06 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:04:21.211 
15:20:06 -- common/autotest_common.sh@1540 -- # grep oacs 00:04:21.212 15:20:06 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:04:21.212 15:20:06 -- common/autotest_common.sh@1540 -- # oacs=' 0xe' 00:04:21.212 15:20:06 -- common/autotest_common.sh@1541 -- # oacs_ns_manage=8 00:04:21.212 15:20:06 -- common/autotest_common.sh@1543 -- # [[ 8 -ne 0 ]] 00:04:21.212 15:20:06 -- common/autotest_common.sh@1549 -- # nvme id-ctrl /dev/nvme0 00:04:21.212 15:20:06 -- common/autotest_common.sh@1549 -- # grep unvmcap 00:04:21.212 15:20:06 -- common/autotest_common.sh@1549 -- # cut -d: -f2 00:04:21.212 15:20:07 -- common/autotest_common.sh@1549 -- # unvmcap=' 0' 00:04:21.212 15:20:07 -- common/autotest_common.sh@1550 -- # [[ 0 -eq 0 ]] 00:04:21.212 15:20:07 -- common/autotest_common.sh@1552 -- # continue 00:04:21.212 15:20:07 -- spdk/autotest.sh@133 -- # timing_exit pre_cleanup 00:04:21.212 15:20:07 -- common/autotest_common.sh@728 -- # xtrace_disable 00:04:21.212 15:20:07 -- common/autotest_common.sh@10 -- # set +x 00:04:21.470 15:20:07 -- spdk/autotest.sh@136 -- # timing_enter afterboot 00:04:21.470 15:20:07 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:21.471 15:20:07 -- common/autotest_common.sh@10 -- # set +x 00:04:21.471 15:20:07 -- spdk/autotest.sh@137 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:24.760 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:24.760 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:24.760 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:24.760 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:24.760 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:24.760 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:24.760 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:24.760 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:24.760 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:24.761 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:24.761 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:24.761 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:24.761 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:24.761 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:24.761 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:24.761 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:26.146 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:26.405 15:20:11 -- spdk/autotest.sh@138 -- # timing_exit afterboot 00:04:26.405 15:20:11 -- common/autotest_common.sh@728 -- # xtrace_disable 00:04:26.405 15:20:11 -- common/autotest_common.sh@10 -- # set +x 00:04:26.405 15:20:12 -- spdk/autotest.sh@142 -- # opal_revert_cleanup 00:04:26.405 15:20:12 -- common/autotest_common.sh@1586 -- # mapfile -t bdfs 00:04:26.405 15:20:12 -- common/autotest_common.sh@1586 -- # get_nvme_bdfs_by_id 0x0a54 00:04:26.405 15:20:12 -- common/autotest_common.sh@1572 -- # bdfs=() 00:04:26.405 15:20:12 -- common/autotest_common.sh@1572 -- # local bdfs 00:04:26.405 15:20:12 -- common/autotest_common.sh@1574 -- # get_nvme_bdfs 00:04:26.405 15:20:12 -- common/autotest_common.sh@1508 -- # bdfs=() 00:04:26.405 15:20:12 -- common/autotest_common.sh@1508 -- # local bdfs 00:04:26.405 15:20:12 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:26.405 15:20:12 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:26.405 15:20:12 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:04:26.405 15:20:12 -- 
common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:04:26.405 15:20:12 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:04:26.405 15:20:12 -- common/autotest_common.sh@1574 -- # for bdf in $(get_nvme_bdfs) 00:04:26.405 15:20:12 -- common/autotest_common.sh@1575 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:04:26.405 15:20:12 -- common/autotest_common.sh@1575 -- # device=0x0a54 00:04:26.405 15:20:12 -- common/autotest_common.sh@1576 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:04:26.405 15:20:12 -- common/autotest_common.sh@1577 -- # bdfs+=($bdf) 00:04:26.405 15:20:12 -- common/autotest_common.sh@1581 -- # printf '%s\n' 0000:d8:00.0 00:04:26.405 15:20:12 -- common/autotest_common.sh@1587 -- # [[ -z 0000:d8:00.0 ]] 00:04:26.405 15:20:12 -- common/autotest_common.sh@1592 -- # spdk_tgt_pid=1569032 00:04:26.405 15:20:12 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:26.405 15:20:12 -- common/autotest_common.sh@1593 -- # waitforlisten 1569032 00:04:26.405 15:20:12 -- common/autotest_common.sh@829 -- # '[' -z 1569032 ']' 00:04:26.405 15:20:12 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:26.405 15:20:12 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:26.405 15:20:12 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:26.405 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:26.405 15:20:12 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:26.405 15:20:12 -- common/autotest_common.sh@10 -- # set +x 00:04:26.405 [2024-12-02 15:20:12.161891] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
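The waitforlisten step above just polls the target's RPC socket until the freshly launched spdk_tgt (pid 1569032) answers; a minimal sketch of the same readiness loop, assuming the default /var/tmp/spdk.sock shown in the trace, the rpc.py helper from this workspace, and an illustrative retry budget:

# Sketch: block until the SPDK target responds on its RPC socket.
# The retry count and the rpc_get_methods probe are assumptions,
# not the exact values used by autotest_common.sh.
sock=/var/tmp/spdk.sock
for _ in $(seq 1 100); do
  if /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py \
      -s "$sock" -t 1 rpc_get_methods >/dev/null 2>&1; then
    break
  fi
  sleep 0.1
done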
00:04:26.405 [2024-12-02 15:20:12.161952] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1569032 ] 00:04:26.405 EAL: No free 2048 kB hugepages reported on node 1 00:04:26.665 [2024-12-02 15:20:12.230429] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:26.665 [2024-12-02 15:20:12.300066] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:26.665 [2024-12-02 15:20:12.300176] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:27.234 15:20:12 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:27.234 15:20:12 -- common/autotest_common.sh@862 -- # return 0 00:04:27.234 15:20:12 -- common/autotest_common.sh@1595 -- # bdf_id=0 00:04:27.234 15:20:12 -- common/autotest_common.sh@1596 -- # for bdf in "${bdfs[@]}" 00:04:27.234 15:20:12 -- common/autotest_common.sh@1597 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:04:30.525 nvme0n1 00:04:30.525 15:20:15 -- common/autotest_common.sh@1599 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:04:30.525 [2024-12-02 15:20:16.154994] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:04:30.525 request: 00:04:30.525 { 00:04:30.525 "nvme_ctrlr_name": "nvme0", 00:04:30.525 "password": "test", 00:04:30.525 "method": "bdev_nvme_opal_revert", 00:04:30.525 "req_id": 1 00:04:30.525 } 00:04:30.525 Got JSON-RPC error response 00:04:30.525 response: 00:04:30.525 { 00:04:30.525 "code": -32602, 00:04:30.525 "message": "Invalid parameters" 00:04:30.525 } 00:04:30.525 15:20:16 -- common/autotest_common.sh@1599 -- # true 00:04:30.525 15:20:16 -- common/autotest_common.sh@1600 -- # (( ++bdf_id )) 00:04:30.525 15:20:16 -- common/autotest_common.sh@1603 -- # killprocess 1569032 00:04:30.525 15:20:16 -- common/autotest_common.sh@936 -- # '[' -z 1569032 ']' 00:04:30.525 15:20:16 -- common/autotest_common.sh@940 -- # kill -0 1569032 00:04:30.525 15:20:16 -- common/autotest_common.sh@941 -- # uname 00:04:30.525 15:20:16 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:04:30.525 15:20:16 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1569032 00:04:30.525 15:20:16 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:04:30.525 15:20:16 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:04:30.525 15:20:16 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1569032' 00:04:30.525 killing process with pid 1569032 00:04:30.525 15:20:16 -- common/autotest_common.sh@955 -- # kill 1569032 00:04:30.525 15:20:16 -- common/autotest_common.sh@960 -- # wait 1569032 00:04:33.063 15:20:18 -- spdk/autotest.sh@148 -- # '[' 0 -eq 1 ']' 00:04:33.063 15:20:18 -- spdk/autotest.sh@152 -- # '[' 1 -eq 1 ']' 00:04:33.063 15:20:18 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:04:33.063 15:20:18 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:04:33.063 15:20:18 -- spdk/autotest.sh@160 -- # timing_enter lib 00:04:33.063 15:20:18 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:33.063 15:20:18 -- common/autotest_common.sh@10 -- # set +x 00:04:33.063 15:20:18 -- spdk/autotest.sh@162 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:04:33.063 
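The bdev_nvme_opal_revert failure above is an ordinary JSON-RPC round trip over that same socket; replaying it by hand reproduces the -32602 "Invalid parameters" response on a controller without Opal support:

# Sketch: reissue the exact RPC the test traced above; on this
# non-Opal controller it returns the same JSON-RPC error.
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py \
  -s /var/tmp/spdk.sock bdev_nvme_opal_revert -b nvme0 -p test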
15:20:18 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:33.063 15:20:18 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:33.063 15:20:18 -- common/autotest_common.sh@10 -- # set +x 00:04:33.063 ************************************ 00:04:33.063 START TEST env 00:04:33.064 ************************************ 00:04:33.064 15:20:18 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:04:33.064 * Looking for test storage... 00:04:33.064 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:04:33.064 15:20:18 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:33.064 15:20:18 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:33.064 15:20:18 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:33.064 15:20:18 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:33.064 15:20:18 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:33.064 15:20:18 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:33.064 15:20:18 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:33.064 15:20:18 -- scripts/common.sh@335 -- # IFS=.-: 00:04:33.064 15:20:18 -- scripts/common.sh@335 -- # read -ra ver1 00:04:33.064 15:20:18 -- scripts/common.sh@336 -- # IFS=.-: 00:04:33.064 15:20:18 -- scripts/common.sh@336 -- # read -ra ver2 00:04:33.064 15:20:18 -- scripts/common.sh@337 -- # local 'op=<' 00:04:33.064 15:20:18 -- scripts/common.sh@339 -- # ver1_l=2 00:04:33.064 15:20:18 -- scripts/common.sh@340 -- # ver2_l=1 00:04:33.064 15:20:18 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:33.064 15:20:18 -- scripts/common.sh@343 -- # case "$op" in 00:04:33.064 15:20:18 -- scripts/common.sh@344 -- # : 1 00:04:33.064 15:20:18 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:33.064 15:20:18 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:33.064 15:20:18 -- scripts/common.sh@364 -- # decimal 1 00:04:33.064 15:20:18 -- scripts/common.sh@352 -- # local d=1 00:04:33.064 15:20:18 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:33.064 15:20:18 -- scripts/common.sh@354 -- # echo 1 00:04:33.064 15:20:18 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:33.064 15:20:18 -- scripts/common.sh@365 -- # decimal 2 00:04:33.064 15:20:18 -- scripts/common.sh@352 -- # local d=2 00:04:33.064 15:20:18 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:33.064 15:20:18 -- scripts/common.sh@354 -- # echo 2 00:04:33.064 15:20:18 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:33.064 15:20:18 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:33.064 15:20:18 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:33.064 15:20:18 -- scripts/common.sh@367 -- # return 0 00:04:33.064 15:20:18 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:33.064 15:20:18 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:33.064 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:33.064 --rc genhtml_branch_coverage=1 00:04:33.064 --rc genhtml_function_coverage=1 00:04:33.064 --rc genhtml_legend=1 00:04:33.064 --rc geninfo_all_blocks=1 00:04:33.064 --rc geninfo_unexecuted_blocks=1 00:04:33.064 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:33.064 ' 00:04:33.064 15:20:18 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:33.064 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:33.064 --rc genhtml_branch_coverage=1 00:04:33.064 --rc genhtml_function_coverage=1 00:04:33.064 --rc genhtml_legend=1 00:04:33.064 --rc geninfo_all_blocks=1 00:04:33.064 --rc geninfo_unexecuted_blocks=1 00:04:33.064 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:33.064 ' 00:04:33.064 15:20:18 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:33.064 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:33.064 --rc genhtml_branch_coverage=1 00:04:33.064 --rc genhtml_function_coverage=1 00:04:33.064 --rc genhtml_legend=1 00:04:33.064 --rc geninfo_all_blocks=1 00:04:33.064 --rc geninfo_unexecuted_blocks=1 00:04:33.064 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:33.064 ' 00:04:33.064 15:20:18 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:33.064 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:33.064 --rc genhtml_branch_coverage=1 00:04:33.064 --rc genhtml_function_coverage=1 00:04:33.064 --rc genhtml_legend=1 00:04:33.064 --rc geninfo_all_blocks=1 00:04:33.064 --rc geninfo_unexecuted_blocks=1 00:04:33.064 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:33.064 ' 00:04:33.064 15:20:18 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:04:33.064 15:20:18 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:33.064 15:20:18 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:33.064 15:20:18 -- common/autotest_common.sh@10 -- # set +x 00:04:33.064 ************************************ 00:04:33.064 START TEST env_memory 00:04:33.064 ************************************ 00:04:33.064 15:20:18 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 
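The cmp_versions trace above splits each version string on '.-:' and compares the fields numerically so the harness can tell whether the installed lcov predates 2.x and pick compatible flags; a compact sketch of the same dotted-version test, with a simplified helper name:

# Sketch: field-by-field numeric "less than" for dotted versions.
# version_lt is an illustrative name, not the helper in scripts/common.sh.
version_lt() {
  local IFS=.
  local -a a=($1) b=($2)
  local i
  for ((i = 0; i < ${#a[@]} || i < ${#b[@]}; i++)); do
    (( ${a[i]:-0} < ${b[i]:-0} )) && return 0
    (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
  done
  return 1   # equal is not less-than
}
version_lt 1.15 2 && echo 'lcov 1.15 < 2'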
00:04:33.064 00:04:33.064 00:04:33.064 CUnit - A unit testing framework for C - Version 2.1-3 00:04:33.064 http://cunit.sourceforge.net/ 00:04:33.064 00:04:33.064 00:04:33.064 Suite: memory 00:04:33.064 Test: alloc and free memory map ...[2024-12-02 15:20:18.584515] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:04:33.064 passed 00:04:33.064 Test: mem map translation ...[2024-12-02 15:20:18.597260] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:04:33.064 [2024-12-02 15:20:18.597280] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:04:33.064 [2024-12-02 15:20:18.597309] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:04:33.064 [2024-12-02 15:20:18.597318] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:04:33.064 passed 00:04:33.064 Test: mem map registration ...[2024-12-02 15:20:18.617313] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:04:33.064 [2024-12-02 15:20:18.617329] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:04:33.064 passed 00:04:33.064 Test: mem map adjacent registrations ...passed 00:04:33.064 00:04:33.064 Run Summary: Type Total Ran Passed Failed Inactive 00:04:33.064 suites 1 1 n/a 0 0 00:04:33.064 tests 4 4 4 0 0 00:04:33.064 asserts 152 152 152 0 n/a 00:04:33.064 00:04:33.064 Elapsed time = 0.081 seconds 00:04:33.064 00:04:33.064 real 0m0.093s 00:04:33.064 user 0m0.082s 00:04:33.064 sys 0m0.011s 00:04:33.064 15:20:18 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:33.064 15:20:18 -- common/autotest_common.sh@10 -- # set +x 00:04:33.064 ************************************ 00:04:33.064 END TEST env_memory 00:04:33.064 ************************************ 00:04:33.064 15:20:18 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:04:33.064 15:20:18 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:33.064 15:20:18 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:33.064 15:20:18 -- common/autotest_common.sh@10 -- # set +x 00:04:33.064 ************************************ 00:04:33.064 START TEST env_vtophys 00:04:33.064 ************************************ 00:04:33.064 15:20:18 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:04:33.064 EAL: lib.eal log level changed from notice to debug 00:04:33.064 EAL: Detected lcore 0 as core 0 on socket 0 00:04:33.064 EAL: Detected lcore 1 as core 1 on socket 0 00:04:33.064 EAL: Detected lcore 2 as core 2 on socket 0 00:04:33.064 EAL: Detected lcore 3 as core 3 on socket 0 00:04:33.064 EAL: Detected lcore 4 as core 4 on socket 0 00:04:33.064 EAL: Detected lcore 5 as core 5 on socket 0 00:04:33.064 EAL: Detected lcore 6 as 
core 6 on socket 0 00:04:33.064 EAL: Detected lcore 7 as core 8 on socket 0 00:04:33.064 EAL: Detected lcore 8 as core 9 on socket 0 00:04:33.064 EAL: Detected lcore 9 as core 10 on socket 0 00:04:33.064 EAL: Detected lcore 10 as core 11 on socket 0 00:04:33.064 EAL: Detected lcore 11 as core 12 on socket 0 00:04:33.064 EAL: Detected lcore 12 as core 13 on socket 0 00:04:33.064 EAL: Detected lcore 13 as core 14 on socket 0 00:04:33.064 EAL: Detected lcore 14 as core 16 on socket 0 00:04:33.064 EAL: Detected lcore 15 as core 17 on socket 0 00:04:33.064 EAL: Detected lcore 16 as core 18 on socket 0 00:04:33.064 EAL: Detected lcore 17 as core 19 on socket 0 00:04:33.064 EAL: Detected lcore 18 as core 20 on socket 0 00:04:33.064 EAL: Detected lcore 19 as core 21 on socket 0 00:04:33.064 EAL: Detected lcore 20 as core 22 on socket 0 00:04:33.064 EAL: Detected lcore 21 as core 24 on socket 0 00:04:33.064 EAL: Detected lcore 22 as core 25 on socket 0 00:04:33.064 EAL: Detected lcore 23 as core 26 on socket 0 00:04:33.064 EAL: Detected lcore 24 as core 27 on socket 0 00:04:33.064 EAL: Detected lcore 25 as core 28 on socket 0 00:04:33.064 EAL: Detected lcore 26 as core 29 on socket 0 00:04:33.064 EAL: Detected lcore 27 as core 30 on socket 0 00:04:33.064 EAL: Detected lcore 28 as core 0 on socket 1 00:04:33.065 EAL: Detected lcore 29 as core 1 on socket 1 00:04:33.065 EAL: Detected lcore 30 as core 2 on socket 1 00:04:33.065 EAL: Detected lcore 31 as core 3 on socket 1 00:04:33.065 EAL: Detected lcore 32 as core 4 on socket 1 00:04:33.065 EAL: Detected lcore 33 as core 5 on socket 1 00:04:33.065 EAL: Detected lcore 34 as core 6 on socket 1 00:04:33.065 EAL: Detected lcore 35 as core 8 on socket 1 00:04:33.065 EAL: Detected lcore 36 as core 9 on socket 1 00:04:33.065 EAL: Detected lcore 37 as core 10 on socket 1 00:04:33.065 EAL: Detected lcore 38 as core 11 on socket 1 00:04:33.065 EAL: Detected lcore 39 as core 12 on socket 1 00:04:33.065 EAL: Detected lcore 40 as core 13 on socket 1 00:04:33.065 EAL: Detected lcore 41 as core 14 on socket 1 00:04:33.065 EAL: Detected lcore 42 as core 16 on socket 1 00:04:33.065 EAL: Detected lcore 43 as core 17 on socket 1 00:04:33.065 EAL: Detected lcore 44 as core 18 on socket 1 00:04:33.065 EAL: Detected lcore 45 as core 19 on socket 1 00:04:33.065 EAL: Detected lcore 46 as core 20 on socket 1 00:04:33.065 EAL: Detected lcore 47 as core 21 on socket 1 00:04:33.065 EAL: Detected lcore 48 as core 22 on socket 1 00:04:33.065 EAL: Detected lcore 49 as core 24 on socket 1 00:04:33.065 EAL: Detected lcore 50 as core 25 on socket 1 00:04:33.065 EAL: Detected lcore 51 as core 26 on socket 1 00:04:33.065 EAL: Detected lcore 52 as core 27 on socket 1 00:04:33.065 EAL: Detected lcore 53 as core 28 on socket 1 00:04:33.065 EAL: Detected lcore 54 as core 29 on socket 1 00:04:33.065 EAL: Detected lcore 55 as core 30 on socket 1 00:04:33.065 EAL: Detected lcore 56 as core 0 on socket 0 00:04:33.065 EAL: Detected lcore 57 as core 1 on socket 0 00:04:33.065 EAL: Detected lcore 58 as core 2 on socket 0 00:04:33.065 EAL: Detected lcore 59 as core 3 on socket 0 00:04:33.065 EAL: Detected lcore 60 as core 4 on socket 0 00:04:33.065 EAL: Detected lcore 61 as core 5 on socket 0 00:04:33.065 EAL: Detected lcore 62 as core 6 on socket 0 00:04:33.065 EAL: Detected lcore 63 as core 8 on socket 0 00:04:33.065 EAL: Detected lcore 64 as core 9 on socket 0 00:04:33.065 EAL: Detected lcore 65 as core 10 on socket 0 00:04:33.065 EAL: Detected lcore 66 as core 11 on socket 0 00:04:33.065 EAL: 
Detected lcore 67 as core 12 on socket 0 00:04:33.065 EAL: Detected lcore 68 as core 13 on socket 0 00:04:33.065 EAL: Detected lcore 69 as core 14 on socket 0 00:04:33.065 EAL: Detected lcore 70 as core 16 on socket 0 00:04:33.065 EAL: Detected lcore 71 as core 17 on socket 0 00:04:33.065 EAL: Detected lcore 72 as core 18 on socket 0 00:04:33.065 EAL: Detected lcore 73 as core 19 on socket 0 00:04:33.065 EAL: Detected lcore 74 as core 20 on socket 0 00:04:33.065 EAL: Detected lcore 75 as core 21 on socket 0 00:04:33.065 EAL: Detected lcore 76 as core 22 on socket 0 00:04:33.065 EAL: Detected lcore 77 as core 24 on socket 0 00:04:33.065 EAL: Detected lcore 78 as core 25 on socket 0 00:04:33.065 EAL: Detected lcore 79 as core 26 on socket 0 00:04:33.065 EAL: Detected lcore 80 as core 27 on socket 0 00:04:33.065 EAL: Detected lcore 81 as core 28 on socket 0 00:04:33.065 EAL: Detected lcore 82 as core 29 on socket 0 00:04:33.065 EAL: Detected lcore 83 as core 30 on socket 0 00:04:33.065 EAL: Detected lcore 84 as core 0 on socket 1 00:04:33.065 EAL: Detected lcore 85 as core 1 on socket 1 00:04:33.065 EAL: Detected lcore 86 as core 2 on socket 1 00:04:33.065 EAL: Detected lcore 87 as core 3 on socket 1 00:04:33.065 EAL: Detected lcore 88 as core 4 on socket 1 00:04:33.065 EAL: Detected lcore 89 as core 5 on socket 1 00:04:33.065 EAL: Detected lcore 90 as core 6 on socket 1 00:04:33.065 EAL: Detected lcore 91 as core 8 on socket 1 00:04:33.065 EAL: Detected lcore 92 as core 9 on socket 1 00:04:33.065 EAL: Detected lcore 93 as core 10 on socket 1 00:04:33.065 EAL: Detected lcore 94 as core 11 on socket 1 00:04:33.065 EAL: Detected lcore 95 as core 12 on socket 1 00:04:33.065 EAL: Detected lcore 96 as core 13 on socket 1 00:04:33.065 EAL: Detected lcore 97 as core 14 on socket 1 00:04:33.065 EAL: Detected lcore 98 as core 16 on socket 1 00:04:33.065 EAL: Detected lcore 99 as core 17 on socket 1 00:04:33.065 EAL: Detected lcore 100 as core 18 on socket 1 00:04:33.065 EAL: Detected lcore 101 as core 19 on socket 1 00:04:33.065 EAL: Detected lcore 102 as core 20 on socket 1 00:04:33.065 EAL: Detected lcore 103 as core 21 on socket 1 00:04:33.065 EAL: Detected lcore 104 as core 22 on socket 1 00:04:33.065 EAL: Detected lcore 105 as core 24 on socket 1 00:04:33.065 EAL: Detected lcore 106 as core 25 on socket 1 00:04:33.065 EAL: Detected lcore 107 as core 26 on socket 1 00:04:33.065 EAL: Detected lcore 108 as core 27 on socket 1 00:04:33.065 EAL: Detected lcore 109 as core 28 on socket 1 00:04:33.065 EAL: Detected lcore 110 as core 29 on socket 1 00:04:33.065 EAL: Detected lcore 111 as core 30 on socket 1 00:04:33.065 EAL: Maximum logical cores by configuration: 128 00:04:33.065 EAL: Detected CPU lcores: 112 00:04:33.065 EAL: Detected NUMA nodes: 2 00:04:33.065 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:04:33.065 EAL: Checking presence of .so 'librte_eal.so.24' 00:04:33.065 EAL: Checking presence of .so 'librte_eal.so' 00:04:33.065 EAL: Detected static linkage of DPDK 00:04:33.065 EAL: No shared files mode enabled, IPC will be disabled 00:04:33.065 EAL: Bus pci wants IOVA as 'DC' 00:04:33.065 EAL: Buses did not request a specific IOVA mode. 00:04:33.065 EAL: IOMMU is available, selecting IOVA as VA mode. 00:04:33.065 EAL: Selected IOVA mode 'VA' 00:04:33.065 EAL: No free 2048 kB hugepages reported on node 1 00:04:33.065 EAL: Probing VFIO support... 
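EAL's inventory above (112 lcores across 2 NUMA nodes) is read from Linux sysfs; the same lcore-to-core/socket mapping can be dumped directly, assuming the standard topology layout:

# Sketch: reproduce the lcore -> core/socket table EAL just printed.
for cpu in /sys/devices/system/cpu/cpu[0-9]*; do
  printf 'lcore %s core %s socket %s\n' "${cpu##*/cpu}" \
    "$(cat "$cpu/topology/core_id")" \
    "$(cat "$cpu/topology/physical_package_id")"
done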
00:04:33.065 EAL: IOMMU type 1 (Type 1) is supported 00:04:33.065 EAL: IOMMU type 7 (sPAPR) is not supported 00:04:33.065 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:04:33.065 EAL: VFIO support initialized 00:04:33.065 EAL: Ask a virtual area of 0x2e000 bytes 00:04:33.065 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:04:33.065 EAL: Setting up physically contiguous memory... 00:04:33.065 EAL: Setting maximum number of open files to 524288 00:04:33.065 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:04:33.065 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:04:33.065 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:04:33.065 EAL: Ask a virtual area of 0x61000 bytes 00:04:33.065 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:04:33.065 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:33.065 EAL: Ask a virtual area of 0x400000000 bytes 00:04:33.065 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:04:33.065 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:04:33.065 EAL: Ask a virtual area of 0x61000 bytes 00:04:33.065 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:04:33.065 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:33.065 EAL: Ask a virtual area of 0x400000000 bytes 00:04:33.065 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:04:33.065 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:04:33.065 EAL: Ask a virtual area of 0x61000 bytes 00:04:33.065 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:04:33.065 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:33.065 EAL: Ask a virtual area of 0x400000000 bytes 00:04:33.065 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:04:33.065 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:04:33.065 EAL: Ask a virtual area of 0x61000 bytes 00:04:33.065 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:04:33.065 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:33.065 EAL: Ask a virtual area of 0x400000000 bytes 00:04:33.065 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:04:33.065 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:04:33.065 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:04:33.065 EAL: Ask a virtual area of 0x61000 bytes 00:04:33.065 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:04:33.065 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:33.065 EAL: Ask a virtual area of 0x400000000 bytes 00:04:33.065 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:04:33.065 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:04:33.065 EAL: Ask a virtual area of 0x61000 bytes 00:04:33.065 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:04:33.065 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:33.065 EAL: Ask a virtual area of 0x400000000 bytes 00:04:33.065 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:04:33.065 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:04:33.065 EAL: Ask a virtual area of 0x61000 bytes 00:04:33.065 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:04:33.065 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:33.065 EAL: Ask a virtual area of 0x400000000 bytes 00:04:33.065 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:04:33.065 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:04:33.065 EAL: Ask a virtual area of 0x61000 bytes 00:04:33.065 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:04:33.065 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:33.065 EAL: Ask a virtual area of 0x400000000 bytes 00:04:33.065 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:04:33.065 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:04:33.065 EAL: Hugepages will be freed exactly as allocated. 00:04:33.065 EAL: No shared files mode enabled, IPC is disabled 00:04:33.065 EAL: No shared files mode enabled, IPC is disabled 00:04:33.065 EAL: TSC frequency is ~2500000 KHz 00:04:33.065 EAL: Main lcore 0 is ready (tid=7f152753ba00;cpuset=[0]) 00:04:33.065 EAL: Trying to obtain current memory policy. 00:04:33.065 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:33.065 EAL: Restoring previous memory policy: 0 00:04:33.065 EAL: request: mp_malloc_sync 00:04:33.065 EAL: No shared files mode enabled, IPC is disabled 00:04:33.065 EAL: Heap on socket 0 was expanded by 2MB 00:04:33.065 EAL: No shared files mode enabled, IPC is disabled 00:04:33.065 EAL: Mem event callback 'spdk:(nil)' registered 00:04:33.065 00:04:33.065 00:04:33.065 CUnit - A unit testing framework for C - Version 2.1-3 00:04:33.065 http://cunit.sourceforge.net/ 00:04:33.065 00:04:33.065 00:04:33.065 Suite: components_suite 00:04:33.066 Test: vtophys_malloc_test ...passed 00:04:33.066 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:04:33.066 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:33.066 EAL: Restoring previous memory policy: 4 00:04:33.066 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.066 EAL: request: mp_malloc_sync 00:04:33.066 EAL: No shared files mode enabled, IPC is disabled 00:04:33.066 EAL: Heap on socket 0 was expanded by 4MB 00:04:33.066 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.066 EAL: request: mp_malloc_sync 00:04:33.066 EAL: No shared files mode enabled, IPC is disabled 00:04:33.066 EAL: Heap on socket 0 was shrunk by 4MB 00:04:33.066 EAL: Trying to obtain current memory policy. 00:04:33.066 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:33.066 EAL: Restoring previous memory policy: 4 00:04:33.066 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.066 EAL: request: mp_malloc_sync 00:04:33.066 EAL: No shared files mode enabled, IPC is disabled 00:04:33.066 EAL: Heap on socket 0 was expanded by 6MB 00:04:33.066 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.066 EAL: request: mp_malloc_sync 00:04:33.066 EAL: No shared files mode enabled, IPC is disabled 00:04:33.066 EAL: Heap on socket 0 was shrunk by 6MB 00:04:33.066 EAL: Trying to obtain current memory policy. 00:04:33.066 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:33.066 EAL: Restoring previous memory policy: 4 00:04:33.066 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.066 EAL: request: mp_malloc_sync 00:04:33.066 EAL: No shared files mode enabled, IPC is disabled 00:04:33.066 EAL: Heap on socket 0 was expanded by 10MB 00:04:33.066 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.066 EAL: request: mp_malloc_sync 00:04:33.066 EAL: No shared files mode enabled, IPC is disabled 00:04:33.066 EAL: Heap on socket 0 was shrunk by 10MB 00:04:33.066 EAL: Trying to obtain current memory policy. 
00:04:33.066 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:33.066 EAL: Restoring previous memory policy: 4 00:04:33.066 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.066 EAL: request: mp_malloc_sync 00:04:33.066 EAL: No shared files mode enabled, IPC is disabled 00:04:33.066 EAL: Heap on socket 0 was expanded by 18MB 00:04:33.066 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.066 EAL: request: mp_malloc_sync 00:04:33.066 EAL: No shared files mode enabled, IPC is disabled 00:04:33.066 EAL: Heap on socket 0 was shrunk by 18MB 00:04:33.066 EAL: Trying to obtain current memory policy. 00:04:33.066 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:33.066 EAL: Restoring previous memory policy: 4 00:04:33.066 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.066 EAL: request: mp_malloc_sync 00:04:33.066 EAL: No shared files mode enabled, IPC is disabled 00:04:33.066 EAL: Heap on socket 0 was expanded by 34MB 00:04:33.066 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.066 EAL: request: mp_malloc_sync 00:04:33.066 EAL: No shared files mode enabled, IPC is disabled 00:04:33.066 EAL: Heap on socket 0 was shrunk by 34MB 00:04:33.066 EAL: Trying to obtain current memory policy. 00:04:33.066 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:33.066 EAL: Restoring previous memory policy: 4 00:04:33.066 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.066 EAL: request: mp_malloc_sync 00:04:33.066 EAL: No shared files mode enabled, IPC is disabled 00:04:33.066 EAL: Heap on socket 0 was expanded by 66MB 00:04:33.066 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.066 EAL: request: mp_malloc_sync 00:04:33.066 EAL: No shared files mode enabled, IPC is disabled 00:04:33.066 EAL: Heap on socket 0 was shrunk by 66MB 00:04:33.066 EAL: Trying to obtain current memory policy. 00:04:33.066 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:33.066 EAL: Restoring previous memory policy: 4 00:04:33.066 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.066 EAL: request: mp_malloc_sync 00:04:33.066 EAL: No shared files mode enabled, IPC is disabled 00:04:33.066 EAL: Heap on socket 0 was expanded by 130MB 00:04:33.066 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.066 EAL: request: mp_malloc_sync 00:04:33.066 EAL: No shared files mode enabled, IPC is disabled 00:04:33.066 EAL: Heap on socket 0 was shrunk by 130MB 00:04:33.066 EAL: Trying to obtain current memory policy. 00:04:33.066 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:33.326 EAL: Restoring previous memory policy: 4 00:04:33.326 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.326 EAL: request: mp_malloc_sync 00:04:33.326 EAL: No shared files mode enabled, IPC is disabled 00:04:33.326 EAL: Heap on socket 0 was expanded by 258MB 00:04:33.326 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.326 EAL: request: mp_malloc_sync 00:04:33.326 EAL: No shared files mode enabled, IPC is disabled 00:04:33.326 EAL: Heap on socket 0 was shrunk by 258MB 00:04:33.326 EAL: Trying to obtain current memory policy. 
00:04:33.326 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:33.326 EAL: Restoring previous memory policy: 4 00:04:33.326 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.326 EAL: request: mp_malloc_sync 00:04:33.326 EAL: No shared files mode enabled, IPC is disabled 00:04:33.326 EAL: Heap on socket 0 was expanded by 514MB 00:04:33.586 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.586 EAL: request: mp_malloc_sync 00:04:33.586 EAL: No shared files mode enabled, IPC is disabled 00:04:33.586 EAL: Heap on socket 0 was shrunk by 514MB 00:04:33.586 EAL: Trying to obtain current memory policy. 00:04:33.586 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:33.846 EAL: Restoring previous memory policy: 4 00:04:33.846 EAL: Calling mem event callback 'spdk:(nil)' 00:04:33.846 EAL: request: mp_malloc_sync 00:04:33.846 EAL: No shared files mode enabled, IPC is disabled 00:04:33.846 EAL: Heap on socket 0 was expanded by 1026MB 00:04:33.846 EAL: Calling mem event callback 'spdk:(nil)' 00:04:34.106 EAL: request: mp_malloc_sync 00:04:34.106 EAL: No shared files mode enabled, IPC is disabled 00:04:34.106 EAL: Heap on socket 0 was shrunk by 1026MB 00:04:34.106 passed 00:04:34.106 00:04:34.106 Run Summary: Type Total Ran Passed Failed Inactive 00:04:34.106 suites 1 1 n/a 0 0 00:04:34.106 tests 2 2 2 0 0 00:04:34.106 asserts 497 497 497 0 n/a 00:04:34.106 00:04:34.106 Elapsed time = 0.957 seconds 00:04:34.106 EAL: Calling mem event callback 'spdk:(nil)' 00:04:34.106 EAL: request: mp_malloc_sync 00:04:34.106 EAL: No shared files mode enabled, IPC is disabled 00:04:34.106 EAL: Heap on socket 0 was shrunk by 2MB 00:04:34.106 EAL: No shared files mode enabled, IPC is disabled 00:04:34.106 EAL: No shared files mode enabled, IPC is disabled 00:04:34.106 EAL: No shared files mode enabled, IPC is disabled 00:04:34.106 00:04:34.106 real 0m1.073s 00:04:34.106 user 0m0.621s 00:04:34.106 sys 0m0.426s 00:04:34.106 15:20:19 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:34.106 15:20:19 -- common/autotest_common.sh@10 -- # set +x 00:04:34.106 ************************************ 00:04:34.106 END TEST env_vtophys 00:04:34.106 ************************************ 00:04:34.106 15:20:19 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:04:34.106 15:20:19 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:34.106 15:20:19 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:34.106 15:20:19 -- common/autotest_common.sh@10 -- # set +x 00:04:34.106 ************************************ 00:04:34.106 START TEST env_pci 00:04:34.106 ************************************ 00:04:34.106 15:20:19 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:04:34.106 00:04:34.106 00:04:34.106 CUnit - A unit testing framework for C - Version 2.1-3 00:04:34.106 http://cunit.sourceforge.net/ 00:04:34.106 00:04:34.106 00:04:34.106 Suite: pci 00:04:34.106 Test: pci_hook ...[2024-12-02 15:20:19.815150] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1570498 has claimed it 00:04:34.107 EAL: Cannot find device (10000:00:01.0) 00:04:34.107 EAL: Failed to attach device on primary process 00:04:34.107 passed 00:04:34.107 00:04:34.107 Run Summary: Type Total Ran Passed Failed Inactive 00:04:34.107 suites 1 1 n/a 0 0 00:04:34.107 tests 1 1 1 0 0 
00:04:34.107 asserts 25 25 25 0 n/a 00:04:34.107 00:04:34.107 Elapsed time = 0.036 seconds 00:04:34.107 00:04:34.107 real 0m0.055s 00:04:34.107 user 0m0.017s 00:04:34.107 sys 0m0.037s 00:04:34.107 15:20:19 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:34.107 15:20:19 -- common/autotest_common.sh@10 -- # set +x 00:04:34.107 ************************************ 00:04:34.107 END TEST env_pci 00:04:34.107 ************************************ 00:04:34.107 15:20:19 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:04:34.107 15:20:19 -- env/env.sh@15 -- # uname 00:04:34.107 15:20:19 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:04:34.107 15:20:19 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:04:34.107 15:20:19 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:04:34.107 15:20:19 -- common/autotest_common.sh@1087 -- # '[' 5 -le 1 ']' 00:04:34.107 15:20:19 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:34.107 15:20:19 -- common/autotest_common.sh@10 -- # set +x 00:04:34.107 ************************************ 00:04:34.107 START TEST env_dpdk_post_init 00:04:34.107 ************************************ 00:04:34.107 15:20:19 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:04:34.367 EAL: Detected CPU lcores: 112 00:04:34.367 EAL: Detected NUMA nodes: 2 00:04:34.367 EAL: Detected static linkage of DPDK 00:04:34.367 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:04:34.367 EAL: Selected IOVA mode 'VA' 00:04:34.367 EAL: No free 2048 kB hugepages reported on node 1 00:04:34.367 EAL: VFIO support initialized 00:04:34.367 TELEMETRY: No legacy callbacks, legacy socket not created 00:04:34.367 EAL: Using IOMMU type 1 (Type 1) 00:04:35.305 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:04:38.594 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:04:38.594 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:04:38.853 Starting DPDK initialization... 00:04:38.853 Starting SPDK post initialization... 00:04:38.853 SPDK NVMe probe 00:04:38.853 Attaching to 0000:d8:00.0 00:04:38.853 Attached to 0000:d8:00.0 00:04:38.853 Cleaning up... 
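The DPDK init, probe, and attach above succeed because setup.sh moved 0000:d8:00.0 from the kernel nvme driver to vfio-pci earlier in this log; which driver currently owns the BDF can be checked straight from sysfs:

# Sketch: show the driver bound to the NVMe device the test attached to.
bdf=0000:d8:00.0
basename "$(readlink -f "/sys/bus/pci/devices/$bdf/driver")"
# Prints vfio-pci while SPDK owns the device, nvme after setup.sh reset.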
00:04:38.853 00:04:38.853 real 0m4.756s 00:04:38.853 user 0m3.598s 00:04:38.853 sys 0m0.403s 00:04:38.853 15:20:24 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:38.853 15:20:24 -- common/autotest_common.sh@10 -- # set +x 00:04:38.853 ************************************ 00:04:38.853 END TEST env_dpdk_post_init 00:04:38.853 ************************************ 00:04:39.112 15:20:24 -- env/env.sh@26 -- # uname 00:04:39.112 15:20:24 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:04:39.112 15:20:24 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:04:39.112 15:20:24 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:39.112 15:20:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:39.112 15:20:24 -- common/autotest_common.sh@10 -- # set +x 00:04:39.112 ************************************ 00:04:39.112 START TEST env_mem_callbacks 00:04:39.112 ************************************ 00:04:39.112 15:20:24 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:04:39.112 EAL: Detected CPU lcores: 112 00:04:39.112 EAL: Detected NUMA nodes: 2 00:04:39.112 EAL: Detected static linkage of DPDK 00:04:39.112 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:04:39.112 EAL: Selected IOVA mode 'VA' 00:04:39.112 EAL: No free 2048 kB hugepages reported on node 1 00:04:39.112 EAL: VFIO support initialized 00:04:39.112 TELEMETRY: No legacy callbacks, legacy socket not created 00:04:39.112 00:04:39.112 00:04:39.112 CUnit - A unit testing framework for C - Version 2.1-3 00:04:39.112 http://cunit.sourceforge.net/ 00:04:39.112 00:04:39.112 00:04:39.112 Suite: memory 00:04:39.112 Test: test ... 
00:04:39.112 register 0x200000200000 2097152 00:04:39.112 malloc 3145728 00:04:39.112 register 0x200000400000 4194304 00:04:39.112 buf 0x200000500000 len 3145728 PASSED 00:04:39.112 malloc 64 00:04:39.112 buf 0x2000004fff40 len 64 PASSED 00:04:39.112 malloc 4194304 00:04:39.112 register 0x200000800000 6291456 00:04:39.112 buf 0x200000a00000 len 4194304 PASSED 00:04:39.112 free 0x200000500000 3145728 00:04:39.112 free 0x2000004fff40 64 00:04:39.112 unregister 0x200000400000 4194304 PASSED 00:04:39.112 free 0x200000a00000 4194304 00:04:39.112 unregister 0x200000800000 6291456 PASSED 00:04:39.112 malloc 8388608 00:04:39.112 register 0x200000400000 10485760 00:04:39.112 buf 0x200000600000 len 8388608 PASSED 00:04:39.112 free 0x200000600000 8388608 00:04:39.112 unregister 0x200000400000 10485760 PASSED 00:04:39.112 passed 00:04:39.112 00:04:39.112 Run Summary: Type Total Ran Passed Failed Inactive 00:04:39.112 suites 1 1 n/a 0 0 00:04:39.112 tests 1 1 1 0 0 00:04:39.112 asserts 15 15 15 0 n/a 00:04:39.112 00:04:39.112 Elapsed time = 0.005 seconds 00:04:39.112 00:04:39.112 real 0m0.065s 00:04:39.112 user 0m0.015s 00:04:39.112 sys 0m0.050s 00:04:39.112 15:20:24 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:39.112 15:20:24 -- common/autotest_common.sh@10 -- # set +x 00:04:39.112 ************************************ 00:04:39.112 END TEST env_mem_callbacks 00:04:39.112 ************************************ 00:04:39.112 00:04:39.112 real 0m6.460s 00:04:39.112 user 0m4.506s 00:04:39.112 sys 0m1.229s 00:04:39.112 15:20:24 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:39.112 15:20:24 -- common/autotest_common.sh@10 -- # set +x 00:04:39.112 ************************************ 00:04:39.112 END TEST env 00:04:39.112 ************************************ 00:04:39.112 15:20:24 -- spdk/autotest.sh@163 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:04:39.112 15:20:24 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:39.112 15:20:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:39.112 15:20:24 -- common/autotest_common.sh@10 -- # set +x 00:04:39.112 ************************************ 00:04:39.112 START TEST rpc 00:04:39.112 ************************************ 00:04:39.112 15:20:24 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:04:39.371 * Looking for test storage... 
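The register/unregister pairs in the mem_callbacks trace above come from DPDK memory events: when a malloc forces the heap to map new memory, SPDK's callback registers the new region for DMA, and when the last allocation in a region is freed, the region is unmapped and unregistered. A toy Python model of that bookkeeping, matching the addresses and sizes printed above (illustrative only — the real callbacks are C functions hooked through DPDK's mem-event registration):

```python
class MemEventTracker:
    """Toy model of the register/unregister pairing in the mem_callbacks
    trace: regions are registered as the heap grows and must be
    unregistered, with the same length, as it shrinks."""
    def __init__(self):
        self.regions = {}  # vaddr -> length

    def on_alloc(self, vaddr: int, length: int):
        print(f"register 0x{vaddr:x} {length}")
        self.regions[vaddr] = length

    def on_free(self, vaddr: int, length: int):
        assert self.regions.pop(vaddr) == length, "unbalanced unregister"
        print(f"unregister 0x{vaddr:x} {length}")

t = MemEventTracker()
t.on_alloc(0x200000200000, 2 * 1024 * 1024)  # first malloc maps 2MB
t.on_alloc(0x200000400000, 4 * 1024 * 1024)  # 3MB malloc grows heap by 4MB
t.on_free(0x200000400000, 4 * 1024 * 1024)   # freeing it shrinks the heap
```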
00:04:39.371 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:04:39.371 15:20:24 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:39.371 15:20:24 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:39.371 15:20:24 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:39.371 15:20:25 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:39.371 15:20:25 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:39.371 15:20:25 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:39.371 15:20:25 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:39.371 15:20:25 -- scripts/common.sh@335 -- # IFS=.-: 00:04:39.371 15:20:25 -- scripts/common.sh@335 -- # read -ra ver1 00:04:39.371 15:20:25 -- scripts/common.sh@336 -- # IFS=.-: 00:04:39.371 15:20:25 -- scripts/common.sh@336 -- # read -ra ver2 00:04:39.371 15:20:25 -- scripts/common.sh@337 -- # local 'op=<' 00:04:39.371 15:20:25 -- scripts/common.sh@339 -- # ver1_l=2 00:04:39.371 15:20:25 -- scripts/common.sh@340 -- # ver2_l=1 00:04:39.371 15:20:25 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:39.371 15:20:25 -- scripts/common.sh@343 -- # case "$op" in 00:04:39.371 15:20:25 -- scripts/common.sh@344 -- # : 1 00:04:39.371 15:20:25 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:39.371 15:20:25 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:39.371 15:20:25 -- scripts/common.sh@364 -- # decimal 1 00:04:39.371 15:20:25 -- scripts/common.sh@352 -- # local d=1 00:04:39.371 15:20:25 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:39.371 15:20:25 -- scripts/common.sh@354 -- # echo 1 00:04:39.371 15:20:25 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:39.371 15:20:25 -- scripts/common.sh@365 -- # decimal 2 00:04:39.371 15:20:25 -- scripts/common.sh@352 -- # local d=2 00:04:39.371 15:20:25 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:39.371 15:20:25 -- scripts/common.sh@354 -- # echo 2 00:04:39.371 15:20:25 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:39.371 15:20:25 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:39.371 15:20:25 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:39.371 15:20:25 -- scripts/common.sh@367 -- # return 0 00:04:39.371 15:20:25 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:39.371 15:20:25 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:39.371 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.371 --rc genhtml_branch_coverage=1 00:04:39.371 --rc genhtml_function_coverage=1 00:04:39.371 --rc genhtml_legend=1 00:04:39.371 --rc geninfo_all_blocks=1 00:04:39.371 --rc geninfo_unexecuted_blocks=1 00:04:39.371 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.371 ' 00:04:39.371 15:20:25 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:39.371 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.371 --rc genhtml_branch_coverage=1 00:04:39.371 --rc genhtml_function_coverage=1 00:04:39.371 --rc genhtml_legend=1 00:04:39.371 --rc geninfo_all_blocks=1 00:04:39.371 --rc geninfo_unexecuted_blocks=1 00:04:39.371 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.371 ' 00:04:39.371 15:20:25 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:39.371 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.371 --rc genhtml_branch_coverage=1 00:04:39.371 
--rc genhtml_function_coverage=1 00:04:39.371 --rc genhtml_legend=1 00:04:39.371 --rc geninfo_all_blocks=1 00:04:39.371 --rc geninfo_unexecuted_blocks=1 00:04:39.371 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.371 ' 00:04:39.371 15:20:25 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:39.371 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.371 --rc genhtml_branch_coverage=1 00:04:39.371 --rc genhtml_function_coverage=1 00:04:39.371 --rc genhtml_legend=1 00:04:39.371 --rc geninfo_all_blocks=1 00:04:39.371 --rc geninfo_unexecuted_blocks=1 00:04:39.371 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.371 ' 00:04:39.371 15:20:25 -- rpc/rpc.sh@65 -- # spdk_pid=1571511 00:04:39.371 15:20:25 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:04:39.371 15:20:25 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:04:39.371 15:20:25 -- rpc/rpc.sh@67 -- # waitforlisten 1571511 00:04:39.371 15:20:25 -- common/autotest_common.sh@829 -- # '[' -z 1571511 ']' 00:04:39.371 15:20:25 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:39.371 15:20:25 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:39.371 15:20:25 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:39.371 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:39.371 15:20:25 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:39.371 15:20:25 -- common/autotest_common.sh@10 -- # set +x 00:04:39.371 [2024-12-02 15:20:25.092241] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:04:39.371 [2024-12-02 15:20:25.092313] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1571511 ] 00:04:39.371 EAL: No free 2048 kB hugepages reported on node 1 00:04:39.371 [2024-12-02 15:20:25.161201] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:39.629 [2024-12-02 15:20:25.230108] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:39.629 [2024-12-02 15:20:25.230211] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:04:39.629 [2024-12-02 15:20:25.230222] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1571511' to capture a snapshot of events at runtime. 00:04:39.630 [2024-12-02 15:20:25.230231] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1571511 for offline analysis/debug. 
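The waitforlisten step above does nothing more than poll until spdk_tgt's RPC socket accepts a connection, giving up after max_retries. A minimal sketch of the same idea, assuming the default socket path (the function name and defaults here are illustrative; the real logic is the bash waitforlisten in test/common/autotest_common.sh):

```python
import socket
import time

def waitforlisten(sock_path: str = "/var/tmp/spdk.sock",
                  max_retries: int = 100, delay: float = 0.1) -> None:
    """Poll a UNIX-domain socket until the target accepts a connection."""
    for _ in range(max_retries):
        s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        try:
            s.connect(sock_path)
            return  # target is up and listening
        except OSError:
            time.sleep(delay)  # not listening yet; retry
        finally:
            s.close()
    raise TimeoutError(f"{sock_path} never started listening")
```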
00:04:39.630 [2024-12-02 15:20:25.230248] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:40.196 15:20:25 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:40.196 15:20:25 -- common/autotest_common.sh@862 -- # return 0 00:04:40.196 15:20:25 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:04:40.196 15:20:25 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:04:40.196 15:20:25 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:04:40.196 15:20:25 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:04:40.196 15:20:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:40.196 15:20:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:40.196 15:20:25 -- common/autotest_common.sh@10 -- # set +x 00:04:40.196 ************************************ 00:04:40.196 START TEST rpc_integrity 00:04:40.196 ************************************ 00:04:40.196 15:20:25 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:04:40.196 15:20:25 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:04:40.196 15:20:25 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:40.196 15:20:25 -- common/autotest_common.sh@10 -- # set +x 00:04:40.196 15:20:25 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:40.196 15:20:25 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:04:40.196 15:20:25 -- rpc/rpc.sh@13 -- # jq length 00:04:40.196 15:20:25 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:04:40.196 15:20:25 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:04:40.196 15:20:25 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:40.196 15:20:25 -- common/autotest_common.sh@10 -- # set +x 00:04:40.196 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:40.196 15:20:26 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:04:40.196 15:20:26 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:04:40.196 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:40.196 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.456 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:40.456 15:20:26 -- rpc/rpc.sh@16 -- # bdevs='[ 00:04:40.456 { 00:04:40.456 "name": "Malloc0", 00:04:40.456 "aliases": [ 00:04:40.456 "50c73f91-b3ce-4c7e-ad7b-d8ae16ab8ae9" 00:04:40.456 ], 00:04:40.457 "product_name": "Malloc disk", 00:04:40.457 "block_size": 512, 00:04:40.457 "num_blocks": 16384, 00:04:40.457 "uuid": "50c73f91-b3ce-4c7e-ad7b-d8ae16ab8ae9", 00:04:40.457 "assigned_rate_limits": { 00:04:40.457 "rw_ios_per_sec": 0, 00:04:40.457 "rw_mbytes_per_sec": 0, 00:04:40.457 "r_mbytes_per_sec": 0, 00:04:40.457 "w_mbytes_per_sec": 0 00:04:40.457 }, 00:04:40.457 "claimed": false, 00:04:40.457 "zoned": false, 00:04:40.457 "supported_io_types": { 00:04:40.457 "read": true, 00:04:40.457 "write": true, 00:04:40.457 "unmap": true, 00:04:40.457 "write_zeroes": true, 00:04:40.457 "flush": true, 00:04:40.457 "reset": true, 00:04:40.457 "compare": false, 00:04:40.457 "compare_and_write": false, 
00:04:40.457 "abort": true, 00:04:40.457 "nvme_admin": false, 00:04:40.457 "nvme_io": false 00:04:40.457 }, 00:04:40.457 "memory_domains": [ 00:04:40.457 { 00:04:40.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:40.457 "dma_device_type": 2 00:04:40.457 } 00:04:40.457 ], 00:04:40.457 "driver_specific": {} 00:04:40.457 } 00:04:40.457 ]' 00:04:40.457 15:20:26 -- rpc/rpc.sh@17 -- # jq length 00:04:40.457 15:20:26 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:04:40.457 15:20:26 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:04:40.457 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:40.457 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.457 [2024-12-02 15:20:26.076712] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:04:40.457 [2024-12-02 15:20:26.076751] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:04:40.457 [2024-12-02 15:20:26.076773] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5577030 00:04:40.457 [2024-12-02 15:20:26.076784] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:04:40.457 [2024-12-02 15:20:26.077602] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:04:40.457 [2024-12-02 15:20:26.077625] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:04:40.457 Passthru0 00:04:40.457 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:40.457 15:20:26 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:04:40.457 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:40.457 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.457 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:40.457 15:20:26 -- rpc/rpc.sh@20 -- # bdevs='[ 00:04:40.457 { 00:04:40.457 "name": "Malloc0", 00:04:40.457 "aliases": [ 00:04:40.457 "50c73f91-b3ce-4c7e-ad7b-d8ae16ab8ae9" 00:04:40.457 ], 00:04:40.457 "product_name": "Malloc disk", 00:04:40.457 "block_size": 512, 00:04:40.457 "num_blocks": 16384, 00:04:40.457 "uuid": "50c73f91-b3ce-4c7e-ad7b-d8ae16ab8ae9", 00:04:40.457 "assigned_rate_limits": { 00:04:40.457 "rw_ios_per_sec": 0, 00:04:40.457 "rw_mbytes_per_sec": 0, 00:04:40.457 "r_mbytes_per_sec": 0, 00:04:40.457 "w_mbytes_per_sec": 0 00:04:40.457 }, 00:04:40.457 "claimed": true, 00:04:40.457 "claim_type": "exclusive_write", 00:04:40.457 "zoned": false, 00:04:40.457 "supported_io_types": { 00:04:40.457 "read": true, 00:04:40.457 "write": true, 00:04:40.457 "unmap": true, 00:04:40.457 "write_zeroes": true, 00:04:40.457 "flush": true, 00:04:40.457 "reset": true, 00:04:40.457 "compare": false, 00:04:40.457 "compare_and_write": false, 00:04:40.457 "abort": true, 00:04:40.457 "nvme_admin": false, 00:04:40.457 "nvme_io": false 00:04:40.457 }, 00:04:40.457 "memory_domains": [ 00:04:40.457 { 00:04:40.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:40.457 "dma_device_type": 2 00:04:40.457 } 00:04:40.457 ], 00:04:40.457 "driver_specific": {} 00:04:40.457 }, 00:04:40.457 { 00:04:40.457 "name": "Passthru0", 00:04:40.457 "aliases": [ 00:04:40.457 "cd51979a-1c18-54b3-9bef-b6a7d5e6b387" 00:04:40.457 ], 00:04:40.457 "product_name": "passthru", 00:04:40.457 "block_size": 512, 00:04:40.457 "num_blocks": 16384, 00:04:40.457 "uuid": "cd51979a-1c18-54b3-9bef-b6a7d5e6b387", 00:04:40.457 "assigned_rate_limits": { 00:04:40.457 "rw_ios_per_sec": 0, 00:04:40.457 "rw_mbytes_per_sec": 0, 00:04:40.457 "r_mbytes_per_sec": 0, 00:04:40.457 
"w_mbytes_per_sec": 0 00:04:40.457 }, 00:04:40.457 "claimed": false, 00:04:40.457 "zoned": false, 00:04:40.457 "supported_io_types": { 00:04:40.457 "read": true, 00:04:40.457 "write": true, 00:04:40.457 "unmap": true, 00:04:40.457 "write_zeroes": true, 00:04:40.457 "flush": true, 00:04:40.457 "reset": true, 00:04:40.457 "compare": false, 00:04:40.457 "compare_and_write": false, 00:04:40.457 "abort": true, 00:04:40.457 "nvme_admin": false, 00:04:40.457 "nvme_io": false 00:04:40.457 }, 00:04:40.457 "memory_domains": [ 00:04:40.457 { 00:04:40.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:40.457 "dma_device_type": 2 00:04:40.457 } 00:04:40.457 ], 00:04:40.457 "driver_specific": { 00:04:40.457 "passthru": { 00:04:40.457 "name": "Passthru0", 00:04:40.457 "base_bdev_name": "Malloc0" 00:04:40.457 } 00:04:40.457 } 00:04:40.457 } 00:04:40.457 ]' 00:04:40.457 15:20:26 -- rpc/rpc.sh@21 -- # jq length 00:04:40.457 15:20:26 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:04:40.457 15:20:26 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:04:40.457 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:40.457 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.457 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:40.457 15:20:26 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:04:40.457 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:40.457 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.457 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:40.457 15:20:26 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:04:40.457 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:40.457 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.457 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:40.457 15:20:26 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:04:40.457 15:20:26 -- rpc/rpc.sh@26 -- # jq length 00:04:40.457 15:20:26 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:04:40.457 00:04:40.457 real 0m0.286s 00:04:40.457 user 0m0.174s 00:04:40.457 sys 0m0.047s 00:04:40.457 15:20:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:40.457 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.457 ************************************ 00:04:40.457 END TEST rpc_integrity 00:04:40.457 ************************************ 00:04:40.457 15:20:26 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:04:40.457 15:20:26 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:40.457 15:20:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:40.457 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.457 ************************************ 00:04:40.457 START TEST rpc_plugins 00:04:40.457 ************************************ 00:04:40.457 15:20:26 -- common/autotest_common.sh@1114 -- # rpc_plugins 00:04:40.457 15:20:26 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:04:40.457 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:40.457 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.717 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:40.717 15:20:26 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:04:40.717 15:20:26 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:04:40.717 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:40.717 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.717 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:40.717 15:20:26 -- 
rpc/rpc.sh@31 -- # bdevs='[ 00:04:40.717 { 00:04:40.717 "name": "Malloc1", 00:04:40.717 "aliases": [ 00:04:40.717 "af686052-3077-4b8d-a996-64c4d6c6e039" 00:04:40.717 ], 00:04:40.717 "product_name": "Malloc disk", 00:04:40.717 "block_size": 4096, 00:04:40.717 "num_blocks": 256, 00:04:40.717 "uuid": "af686052-3077-4b8d-a996-64c4d6c6e039", 00:04:40.717 "assigned_rate_limits": { 00:04:40.717 "rw_ios_per_sec": 0, 00:04:40.717 "rw_mbytes_per_sec": 0, 00:04:40.717 "r_mbytes_per_sec": 0, 00:04:40.717 "w_mbytes_per_sec": 0 00:04:40.717 }, 00:04:40.717 "claimed": false, 00:04:40.717 "zoned": false, 00:04:40.717 "supported_io_types": { 00:04:40.717 "read": true, 00:04:40.717 "write": true, 00:04:40.717 "unmap": true, 00:04:40.717 "write_zeroes": true, 00:04:40.717 "flush": true, 00:04:40.717 "reset": true, 00:04:40.717 "compare": false, 00:04:40.717 "compare_and_write": false, 00:04:40.717 "abort": true, 00:04:40.717 "nvme_admin": false, 00:04:40.717 "nvme_io": false 00:04:40.717 }, 00:04:40.717 "memory_domains": [ 00:04:40.717 { 00:04:40.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:40.717 "dma_device_type": 2 00:04:40.717 } 00:04:40.717 ], 00:04:40.717 "driver_specific": {} 00:04:40.717 } 00:04:40.717 ]' 00:04:40.717 15:20:26 -- rpc/rpc.sh@32 -- # jq length 00:04:40.717 15:20:26 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:04:40.717 15:20:26 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:04:40.717 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:40.717 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.717 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:40.717 15:20:26 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:04:40.717 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:40.717 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.717 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:40.717 15:20:26 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:04:40.717 15:20:26 -- rpc/rpc.sh@36 -- # jq length 00:04:40.717 15:20:26 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:04:40.717 00:04:40.717 real 0m0.142s 00:04:40.717 user 0m0.090s 00:04:40.717 sys 0m0.022s 00:04:40.717 15:20:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:40.717 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.717 ************************************ 00:04:40.717 END TEST rpc_plugins 00:04:40.717 ************************************ 00:04:40.717 15:20:26 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:04:40.717 15:20:26 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:40.717 15:20:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:40.717 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.717 ************************************ 00:04:40.717 START TEST rpc_trace_cmd_test 00:04:40.717 ************************************ 00:04:40.717 15:20:26 -- common/autotest_common.sh@1114 -- # rpc_trace_cmd_test 00:04:40.717 15:20:26 -- rpc/rpc.sh@40 -- # local info 00:04:40.717 15:20:26 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:04:40.717 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:40.717 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.717 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:40.717 15:20:26 -- rpc/rpc.sh@42 -- # info='{ 00:04:40.717 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1571511", 00:04:40.717 "tpoint_group_mask": "0x8", 00:04:40.717 "iscsi_conn": { 00:04:40.717 "mask": "0x2", 
00:04:40.717 "tpoint_mask": "0x0" 00:04:40.717 }, 00:04:40.717 "scsi": { 00:04:40.717 "mask": "0x4", 00:04:40.717 "tpoint_mask": "0x0" 00:04:40.717 }, 00:04:40.717 "bdev": { 00:04:40.717 "mask": "0x8", 00:04:40.717 "tpoint_mask": "0xffffffffffffffff" 00:04:40.717 }, 00:04:40.717 "nvmf_rdma": { 00:04:40.717 "mask": "0x10", 00:04:40.717 "tpoint_mask": "0x0" 00:04:40.717 }, 00:04:40.717 "nvmf_tcp": { 00:04:40.717 "mask": "0x20", 00:04:40.718 "tpoint_mask": "0x0" 00:04:40.718 }, 00:04:40.718 "ftl": { 00:04:40.718 "mask": "0x40", 00:04:40.718 "tpoint_mask": "0x0" 00:04:40.718 }, 00:04:40.718 "blobfs": { 00:04:40.718 "mask": "0x80", 00:04:40.718 "tpoint_mask": "0x0" 00:04:40.718 }, 00:04:40.718 "dsa": { 00:04:40.718 "mask": "0x200", 00:04:40.718 "tpoint_mask": "0x0" 00:04:40.718 }, 00:04:40.718 "thread": { 00:04:40.718 "mask": "0x400", 00:04:40.718 "tpoint_mask": "0x0" 00:04:40.718 }, 00:04:40.718 "nvme_pcie": { 00:04:40.718 "mask": "0x800", 00:04:40.718 "tpoint_mask": "0x0" 00:04:40.718 }, 00:04:40.718 "iaa": { 00:04:40.718 "mask": "0x1000", 00:04:40.718 "tpoint_mask": "0x0" 00:04:40.718 }, 00:04:40.718 "nvme_tcp": { 00:04:40.718 "mask": "0x2000", 00:04:40.718 "tpoint_mask": "0x0" 00:04:40.718 }, 00:04:40.718 "bdev_nvme": { 00:04:40.718 "mask": "0x4000", 00:04:40.718 "tpoint_mask": "0x0" 00:04:40.718 } 00:04:40.718 }' 00:04:40.718 15:20:26 -- rpc/rpc.sh@43 -- # jq length 00:04:40.718 15:20:26 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:04:40.718 15:20:26 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:04:40.977 15:20:26 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:04:40.977 15:20:26 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:04:40.977 15:20:26 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:04:40.977 15:20:26 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:04:40.977 15:20:26 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:04:40.977 15:20:26 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:04:40.977 15:20:26 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:04:40.977 00:04:40.977 real 0m0.189s 00:04:40.977 user 0m0.156s 00:04:40.977 sys 0m0.025s 00:04:40.977 15:20:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:40.977 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.978 ************************************ 00:04:40.978 END TEST rpc_trace_cmd_test 00:04:40.978 ************************************ 00:04:40.978 15:20:26 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:04:40.978 15:20:26 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:04:40.978 15:20:26 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:04:40.978 15:20:26 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:40.978 15:20:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:40.978 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.978 ************************************ 00:04:40.978 START TEST rpc_daemon_integrity 00:04:40.978 ************************************ 00:04:40.978 15:20:26 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:04:40.978 15:20:26 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:04:40.978 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:40.978 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.978 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:40.978 15:20:26 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:04:40.978 15:20:26 -- rpc/rpc.sh@13 -- # jq length 00:04:40.978 15:20:26 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:04:40.978 15:20:26 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:04:40.978 
15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:40.978 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.978 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:40.978 15:20:26 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:04:40.978 15:20:26 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:04:40.978 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:40.978 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:40.978 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:40.978 15:20:26 -- rpc/rpc.sh@16 -- # bdevs='[ 00:04:40.978 { 00:04:40.978 "name": "Malloc2", 00:04:40.978 "aliases": [ 00:04:40.978 "265eea33-8c9a-485b-9f6a-c33304f71734" 00:04:40.978 ], 00:04:40.978 "product_name": "Malloc disk", 00:04:40.978 "block_size": 512, 00:04:40.978 "num_blocks": 16384, 00:04:40.978 "uuid": "265eea33-8c9a-485b-9f6a-c33304f71734", 00:04:40.978 "assigned_rate_limits": { 00:04:40.978 "rw_ios_per_sec": 0, 00:04:40.978 "rw_mbytes_per_sec": 0, 00:04:40.978 "r_mbytes_per_sec": 0, 00:04:40.978 "w_mbytes_per_sec": 0 00:04:40.978 }, 00:04:40.978 "claimed": false, 00:04:40.978 "zoned": false, 00:04:40.978 "supported_io_types": { 00:04:40.978 "read": true, 00:04:40.978 "write": true, 00:04:40.978 "unmap": true, 00:04:40.978 "write_zeroes": true, 00:04:40.978 "flush": true, 00:04:40.978 "reset": true, 00:04:40.978 "compare": false, 00:04:40.978 "compare_and_write": false, 00:04:40.978 "abort": true, 00:04:40.978 "nvme_admin": false, 00:04:40.978 "nvme_io": false 00:04:40.978 }, 00:04:40.978 "memory_domains": [ 00:04:40.978 { 00:04:40.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:40.978 "dma_device_type": 2 00:04:40.978 } 00:04:40.978 ], 00:04:40.978 "driver_specific": {} 00:04:40.978 } 00:04:40.978 ]' 00:04:40.978 15:20:26 -- rpc/rpc.sh@17 -- # jq length 00:04:41.237 15:20:26 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:04:41.237 15:20:26 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:04:41.237 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:41.237 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:41.237 [2024-12-02 15:20:26.822663] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:04:41.237 [2024-12-02 15:20:26.822695] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:04:41.238 [2024-12-02 15:20:26.822711] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5700980 00:04:41.238 [2024-12-02 15:20:26.822720] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:04:41.238 [2024-12-02 15:20:26.823421] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:04:41.238 [2024-12-02 15:20:26.823442] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:04:41.238 Passthru0 00:04:41.238 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:41.238 15:20:26 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:04:41.238 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:41.238 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:41.238 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:41.238 15:20:26 -- rpc/rpc.sh@20 -- # bdevs='[ 00:04:41.238 { 00:04:41.238 "name": "Malloc2", 00:04:41.238 "aliases": [ 00:04:41.238 "265eea33-8c9a-485b-9f6a-c33304f71734" 00:04:41.238 ], 00:04:41.238 "product_name": "Malloc disk", 00:04:41.238 "block_size": 512, 00:04:41.238 "num_blocks": 16384, 
00:04:41.238 "uuid": "265eea33-8c9a-485b-9f6a-c33304f71734", 00:04:41.238 "assigned_rate_limits": { 00:04:41.238 "rw_ios_per_sec": 0, 00:04:41.238 "rw_mbytes_per_sec": 0, 00:04:41.238 "r_mbytes_per_sec": 0, 00:04:41.238 "w_mbytes_per_sec": 0 00:04:41.238 }, 00:04:41.238 "claimed": true, 00:04:41.238 "claim_type": "exclusive_write", 00:04:41.238 "zoned": false, 00:04:41.238 "supported_io_types": { 00:04:41.238 "read": true, 00:04:41.238 "write": true, 00:04:41.238 "unmap": true, 00:04:41.238 "write_zeroes": true, 00:04:41.238 "flush": true, 00:04:41.238 "reset": true, 00:04:41.238 "compare": false, 00:04:41.238 "compare_and_write": false, 00:04:41.238 "abort": true, 00:04:41.238 "nvme_admin": false, 00:04:41.238 "nvme_io": false 00:04:41.238 }, 00:04:41.238 "memory_domains": [ 00:04:41.238 { 00:04:41.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:41.238 "dma_device_type": 2 00:04:41.238 } 00:04:41.238 ], 00:04:41.238 "driver_specific": {} 00:04:41.238 }, 00:04:41.238 { 00:04:41.238 "name": "Passthru0", 00:04:41.238 "aliases": [ 00:04:41.238 "528e8d60-2473-5279-aedd-bb4f9b27b3bd" 00:04:41.238 ], 00:04:41.238 "product_name": "passthru", 00:04:41.238 "block_size": 512, 00:04:41.238 "num_blocks": 16384, 00:04:41.238 "uuid": "528e8d60-2473-5279-aedd-bb4f9b27b3bd", 00:04:41.238 "assigned_rate_limits": { 00:04:41.238 "rw_ios_per_sec": 0, 00:04:41.238 "rw_mbytes_per_sec": 0, 00:04:41.238 "r_mbytes_per_sec": 0, 00:04:41.238 "w_mbytes_per_sec": 0 00:04:41.238 }, 00:04:41.238 "claimed": false, 00:04:41.238 "zoned": false, 00:04:41.238 "supported_io_types": { 00:04:41.238 "read": true, 00:04:41.238 "write": true, 00:04:41.238 "unmap": true, 00:04:41.238 "write_zeroes": true, 00:04:41.238 "flush": true, 00:04:41.238 "reset": true, 00:04:41.238 "compare": false, 00:04:41.238 "compare_and_write": false, 00:04:41.238 "abort": true, 00:04:41.238 "nvme_admin": false, 00:04:41.238 "nvme_io": false 00:04:41.238 }, 00:04:41.238 "memory_domains": [ 00:04:41.238 { 00:04:41.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:41.238 "dma_device_type": 2 00:04:41.238 } 00:04:41.238 ], 00:04:41.238 "driver_specific": { 00:04:41.238 "passthru": { 00:04:41.238 "name": "Passthru0", 00:04:41.238 "base_bdev_name": "Malloc2" 00:04:41.238 } 00:04:41.238 } 00:04:41.238 } 00:04:41.238 ]' 00:04:41.238 15:20:26 -- rpc/rpc.sh@21 -- # jq length 00:04:41.238 15:20:26 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:04:41.238 15:20:26 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:04:41.238 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:41.238 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:41.238 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:41.238 15:20:26 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:04:41.238 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:41.238 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:41.238 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:41.238 15:20:26 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:04:41.238 15:20:26 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:41.238 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:41.238 15:20:26 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:41.238 15:20:26 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:04:41.238 15:20:26 -- rpc/rpc.sh@26 -- # jq length 00:04:41.238 15:20:26 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:04:41.238 00:04:41.238 real 0m0.272s 00:04:41.238 user 0m0.150s 00:04:41.238 sys 0m0.059s 00:04:41.238 
15:20:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:41.238 15:20:26 -- common/autotest_common.sh@10 -- # set +x 00:04:41.238 ************************************ 00:04:41.238 END TEST rpc_daemon_integrity 00:04:41.238 ************************************ 00:04:41.238 15:20:27 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:04:41.238 15:20:27 -- rpc/rpc.sh@84 -- # killprocess 1571511 00:04:41.238 15:20:27 -- common/autotest_common.sh@936 -- # '[' -z 1571511 ']' 00:04:41.238 15:20:27 -- common/autotest_common.sh@940 -- # kill -0 1571511 00:04:41.238 15:20:27 -- common/autotest_common.sh@941 -- # uname 00:04:41.238 15:20:27 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:04:41.238 15:20:27 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1571511 00:04:41.497 15:20:27 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:04:41.497 15:20:27 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:04:41.497 15:20:27 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1571511' 00:04:41.497 killing process with pid 1571511 00:04:41.497 15:20:27 -- common/autotest_common.sh@955 -- # kill 1571511 00:04:41.497 15:20:27 -- common/autotest_common.sh@960 -- # wait 1571511 00:04:41.756 00:04:41.756 real 0m2.503s 00:04:41.756 user 0m3.093s 00:04:41.756 sys 0m0.769s 00:04:41.756 15:20:27 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:41.756 15:20:27 -- common/autotest_common.sh@10 -- # set +x 00:04:41.756 ************************************ 00:04:41.756 END TEST rpc 00:04:41.756 ************************************ 00:04:41.756 15:20:27 -- spdk/autotest.sh@164 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:04:41.756 15:20:27 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:41.756 15:20:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:41.756 15:20:27 -- common/autotest_common.sh@10 -- # set +x 00:04:41.756 ************************************ 00:04:41.756 START TEST rpc_client 00:04:41.756 ************************************ 00:04:41.756 15:20:27 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:04:41.756 * Looking for test storage... 
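The rpc_client test that starts here exercises the same JSON-RPC 2.0 protocol that rpc_cmd used throughout the rpc suite above: one JSON request object over the UNIX socket, one response object back. A minimal Python client sketch of that framing (SPDK ships a full client under python/spdk/rpc; this is only a sketch):

```python
import json
import socket

def rpc_call(method, params=None, sock_path="/var/tmp/spdk.sock"):
    """Send one JSON-RPC 2.0 request to spdk_tgt and return the result."""
    req = {"jsonrpc": "2.0", "id": 1, "method": method}
    if params:
        req["params"] = params
    s = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    s.connect(sock_path)
    s.sendall(json.dumps(req).encode())
    buf = b""
    while True:
        chunk = s.recv(4096)
        if not chunk:
            raise ConnectionError("target closed the socket mid-response")
        buf += chunk
        try:
            resp = json.loads(buf)  # stop once a full object has arrived
            break
        except json.JSONDecodeError:
            continue  # partial read; keep receiving
    s.close()
    if "error" in resp:
        raise RuntimeError(resp["error"])
    return resp["result"]

# The integrity tests above boil down to calls like:
#   rpc_call("bdev_malloc_create", {"num_blocks": 16384, "block_size": 512})
#   rpc_call("bdev_get_bdevs")  # returns the JSON arrays shown in the log
```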
00:04:41.756 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:04:41.756 15:20:27 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:41.756 15:20:27 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:41.756 15:20:27 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:41.756 15:20:27 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:41.756 15:20:27 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:41.756 15:20:27 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:41.756 15:20:27 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:41.756 15:20:27 -- scripts/common.sh@335 -- # IFS=.-: 00:04:41.756 15:20:27 -- scripts/common.sh@335 -- # read -ra ver1 00:04:41.756 15:20:27 -- scripts/common.sh@336 -- # IFS=.-: 00:04:41.756 15:20:27 -- scripts/common.sh@336 -- # read -ra ver2 00:04:41.756 15:20:27 -- scripts/common.sh@337 -- # local 'op=<' 00:04:41.756 15:20:27 -- scripts/common.sh@339 -- # ver1_l=2 00:04:41.756 15:20:27 -- scripts/common.sh@340 -- # ver2_l=1 00:04:41.756 15:20:27 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:41.756 15:20:27 -- scripts/common.sh@343 -- # case "$op" in 00:04:41.756 15:20:27 -- scripts/common.sh@344 -- # : 1 00:04:41.756 15:20:27 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:41.756 15:20:27 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:41.756 15:20:27 -- scripts/common.sh@364 -- # decimal 1 00:04:41.756 15:20:27 -- scripts/common.sh@352 -- # local d=1 00:04:41.756 15:20:27 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:41.756 15:20:27 -- scripts/common.sh@354 -- # echo 1 00:04:41.756 15:20:27 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:41.756 15:20:27 -- scripts/common.sh@365 -- # decimal 2 00:04:41.756 15:20:27 -- scripts/common.sh@352 -- # local d=2 00:04:41.756 15:20:27 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:41.756 15:20:27 -- scripts/common.sh@354 -- # echo 2 00:04:41.756 15:20:27 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:41.756 15:20:27 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:41.756 15:20:27 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:41.756 15:20:27 -- scripts/common.sh@367 -- # return 0 00:04:41.756 15:20:27 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:41.756 15:20:27 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:41.756 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:41.756 --rc genhtml_branch_coverage=1 00:04:41.756 --rc genhtml_function_coverage=1 00:04:41.756 --rc genhtml_legend=1 00:04:41.756 --rc geninfo_all_blocks=1 00:04:41.756 --rc geninfo_unexecuted_blocks=1 00:04:41.756 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:41.757 ' 00:04:41.757 15:20:27 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:41.757 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:41.757 --rc genhtml_branch_coverage=1 00:04:41.757 --rc genhtml_function_coverage=1 00:04:41.757 --rc genhtml_legend=1 00:04:41.757 --rc geninfo_all_blocks=1 00:04:41.757 --rc geninfo_unexecuted_blocks=1 00:04:41.757 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:41.757 ' 00:04:41.757 15:20:27 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:41.757 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:41.757 --rc genhtml_branch_coverage=1 
00:04:41.757 --rc genhtml_function_coverage=1 00:04:41.757 --rc genhtml_legend=1 00:04:41.757 --rc geninfo_all_blocks=1 00:04:41.757 --rc geninfo_unexecuted_blocks=1 00:04:41.757 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:41.757 ' 00:04:41.757 15:20:27 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:41.757 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:41.757 --rc genhtml_branch_coverage=1 00:04:41.757 --rc genhtml_function_coverage=1 00:04:41.757 --rc genhtml_legend=1 00:04:41.757 --rc geninfo_all_blocks=1 00:04:41.757 --rc geninfo_unexecuted_blocks=1 00:04:41.757 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:41.757 ' 00:04:41.757 15:20:27 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:04:42.017 OK 00:04:42.017 15:20:27 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:04:42.017 00:04:42.017 real 0m0.178s 00:04:42.017 user 0m0.092s 00:04:42.017 sys 0m0.095s 00:04:42.017 15:20:27 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:42.017 15:20:27 -- common/autotest_common.sh@10 -- # set +x 00:04:42.017 ************************************ 00:04:42.017 END TEST rpc_client 00:04:42.017 ************************************ 00:04:42.017 15:20:27 -- spdk/autotest.sh@165 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:04:42.017 15:20:27 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:42.017 15:20:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:42.017 15:20:27 -- common/autotest_common.sh@10 -- # set +x 00:04:42.017 ************************************ 00:04:42.017 START TEST json_config 00:04:42.017 ************************************ 00:04:42.017 15:20:27 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:04:42.017 15:20:27 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:42.017 15:20:27 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:42.017 15:20:27 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:42.017 15:20:27 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:42.017 15:20:27 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:42.017 15:20:27 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:42.017 15:20:27 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:42.017 15:20:27 -- scripts/common.sh@335 -- # IFS=.-: 00:04:42.017 15:20:27 -- scripts/common.sh@335 -- # read -ra ver1 00:04:42.017 15:20:27 -- scripts/common.sh@336 -- # IFS=.-: 00:04:42.017 15:20:27 -- scripts/common.sh@336 -- # read -ra ver2 00:04:42.017 15:20:27 -- scripts/common.sh@337 -- # local 'op=<' 00:04:42.017 15:20:27 -- scripts/common.sh@339 -- # ver1_l=2 00:04:42.017 15:20:27 -- scripts/common.sh@340 -- # ver2_l=1 00:04:42.017 15:20:27 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:42.017 15:20:27 -- scripts/common.sh@343 -- # case "$op" in 00:04:42.017 15:20:27 -- scripts/common.sh@344 -- # : 1 00:04:42.017 15:20:27 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:42.017 15:20:27 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:42.017 15:20:27 -- scripts/common.sh@364 -- # decimal 1 00:04:42.017 15:20:27 -- scripts/common.sh@352 -- # local d=1 00:04:42.017 15:20:27 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:42.017 15:20:27 -- scripts/common.sh@354 -- # echo 1 00:04:42.017 15:20:27 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:42.017 15:20:27 -- scripts/common.sh@365 -- # decimal 2 00:04:42.017 15:20:27 -- scripts/common.sh@352 -- # local d=2 00:04:42.017 15:20:27 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:42.017 15:20:27 -- scripts/common.sh@354 -- # echo 2 00:04:42.017 15:20:27 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:42.017 15:20:27 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:42.017 15:20:27 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:42.017 15:20:27 -- scripts/common.sh@367 -- # return 0 00:04:42.017 15:20:27 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:42.017 15:20:27 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:42.017 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.017 --rc genhtml_branch_coverage=1 00:04:42.017 --rc genhtml_function_coverage=1 00:04:42.017 --rc genhtml_legend=1 00:04:42.017 --rc geninfo_all_blocks=1 00:04:42.017 --rc geninfo_unexecuted_blocks=1 00:04:42.017 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:42.018 ' 00:04:42.018 15:20:27 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:42.018 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.018 --rc genhtml_branch_coverage=1 00:04:42.018 --rc genhtml_function_coverage=1 00:04:42.018 --rc genhtml_legend=1 00:04:42.018 --rc geninfo_all_blocks=1 00:04:42.018 --rc geninfo_unexecuted_blocks=1 00:04:42.018 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:42.018 ' 00:04:42.018 15:20:27 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:42.018 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.018 --rc genhtml_branch_coverage=1 00:04:42.018 --rc genhtml_function_coverage=1 00:04:42.018 --rc genhtml_legend=1 00:04:42.018 --rc geninfo_all_blocks=1 00:04:42.018 --rc geninfo_unexecuted_blocks=1 00:04:42.018 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:42.018 ' 00:04:42.018 15:20:27 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:42.018 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.018 --rc genhtml_branch_coverage=1 00:04:42.018 --rc genhtml_function_coverage=1 00:04:42.018 --rc genhtml_legend=1 00:04:42.018 --rc geninfo_all_blocks=1 00:04:42.018 --rc geninfo_unexecuted_blocks=1 00:04:42.018 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:42.018 ' 00:04:42.018 15:20:27 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:04:42.018 15:20:27 -- nvmf/common.sh@7 -- # uname -s 00:04:42.018 15:20:27 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:42.018 15:20:27 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:42.018 15:20:27 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:42.018 15:20:27 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:42.018 15:20:27 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:42.018 15:20:27 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:42.018 15:20:27 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:42.018 15:20:27 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:42.018 15:20:27 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:42.018 15:20:27 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:42.018 15:20:27 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:04:42.018 15:20:27 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:04:42.018 15:20:27 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:42.018 15:20:27 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:42.018 15:20:27 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:42.018 15:20:27 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:04:42.018 15:20:27 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:42.018 15:20:27 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:42.018 15:20:27 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:42.018 15:20:27 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:42.018 15:20:27 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:42.018 15:20:27 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:42.018 15:20:27 -- paths/export.sh@5 -- # export PATH 00:04:42.018 15:20:27 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:42.018 15:20:27 -- nvmf/common.sh@46 -- # : 0 00:04:42.018 15:20:27 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:04:42.018 15:20:27 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:04:42.018 15:20:27 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:04:42.018 15:20:27 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:42.018 15:20:27 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:42.018 15:20:27 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:04:42.018 15:20:27 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:04:42.018 
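The cmp_versions trace that keeps reappearing above (lt 1.15 2, IFS=.-:, read -ra ver1, ...) is scripts/common.sh deciding whether the installed lcov predates 2.x so it can pick compatible LCOV_OPTS. For reference, the same dotted-version comparison expressed in Python (a sketch of the bash logic, not a drop-in replacement):

```python
import re

def cmp_versions(v1: str, op: str, v2: str) -> bool:
    """Mirror scripts/common.sh: split on '.' and '-', compare field by
    field numerically, padding the shorter version with zeros."""
    a = [int(x) for x in re.split(r"[.-]", v1) if x.isdigit()]
    b = [int(x) for x in re.split(r"[.-]", v2) if x.isdigit()]
    n = max(len(a), len(b))
    a += [0] * (n - len(a))
    b += [0] * (n - len(b))
    if a > b:   # list comparison is lexicographic, like the bash loop
        return op in (">", ">=")
    if a < b:
        return op in ("<", "<=")
    return op in ("<=", ">=", "==")

assert cmp_versions("1.15", "<", "2")  # the 'lt 1.15 2' check in the log
```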
15:20:27 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:04:42.018 15:20:27 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:04:42.018 15:20:27 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:04:42.018 15:20:27 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:04:42.018 15:20:27 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:04:42.018 15:20:27 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:04:42.018 WARNING: No tests are enabled so not running JSON configuration tests 00:04:42.018 15:20:27 -- json_config/json_config.sh@27 -- # exit 0 00:04:42.018 00:04:42.018 real 0m0.165s 00:04:42.018 user 0m0.090s 00:04:42.018 sys 0m0.083s 00:04:42.018 15:20:27 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:42.018 15:20:27 -- common/autotest_common.sh@10 -- # set +x 00:04:42.018 ************************************ 00:04:42.018 END TEST json_config 00:04:42.018 ************************************ 00:04:42.277 15:20:27 -- spdk/autotest.sh@166 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:04:42.277 15:20:27 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:42.277 15:20:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:42.277 15:20:27 -- common/autotest_common.sh@10 -- # set +x 00:04:42.277 ************************************ 00:04:42.277 START TEST json_config_extra_key 00:04:42.277 ************************************ 00:04:42.277 15:20:27 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:04:42.277 15:20:27 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:42.277 15:20:27 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:42.277 15:20:27 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:42.277 15:20:27 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:42.277 15:20:27 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:42.277 15:20:27 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:42.277 15:20:27 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:42.277 15:20:27 -- scripts/common.sh@335 -- # IFS=.-: 00:04:42.277 15:20:27 -- scripts/common.sh@335 -- # read -ra ver1 00:04:42.277 15:20:27 -- scripts/common.sh@336 -- # IFS=.-: 00:04:42.277 15:20:27 -- scripts/common.sh@336 -- # read -ra ver2 00:04:42.277 15:20:27 -- scripts/common.sh@337 -- # local 'op=<' 00:04:42.277 15:20:27 -- scripts/common.sh@339 -- # ver1_l=2 00:04:42.277 15:20:27 -- scripts/common.sh@340 -- # ver2_l=1 00:04:42.277 15:20:27 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:42.277 15:20:27 -- scripts/common.sh@343 -- # case "$op" in 00:04:42.277 15:20:27 -- scripts/common.sh@344 -- # : 1 00:04:42.277 15:20:27 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:42.277 15:20:27 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:42.277 15:20:27 -- scripts/common.sh@364 -- # decimal 1 00:04:42.277 15:20:27 -- scripts/common.sh@352 -- # local d=1 00:04:42.277 15:20:27 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:42.277 15:20:27 -- scripts/common.sh@354 -- # echo 1 00:04:42.277 15:20:27 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:42.277 15:20:27 -- scripts/common.sh@365 -- # decimal 2 00:04:42.277 15:20:27 -- scripts/common.sh@352 -- # local d=2 00:04:42.277 15:20:28 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:42.277 15:20:28 -- scripts/common.sh@354 -- # echo 2 00:04:42.277 15:20:28 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:42.277 15:20:28 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:42.277 15:20:28 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:42.277 15:20:28 -- scripts/common.sh@367 -- # return 0 00:04:42.277 15:20:28 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:42.277 15:20:28 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:42.277 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.277 --rc genhtml_branch_coverage=1 00:04:42.278 --rc genhtml_function_coverage=1 00:04:42.278 --rc genhtml_legend=1 00:04:42.278 --rc geninfo_all_blocks=1 00:04:42.278 --rc geninfo_unexecuted_blocks=1 00:04:42.278 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:42.278 ' 00:04:42.278 15:20:28 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:42.278 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.278 --rc genhtml_branch_coverage=1 00:04:42.278 --rc genhtml_function_coverage=1 00:04:42.278 --rc genhtml_legend=1 00:04:42.278 --rc geninfo_all_blocks=1 00:04:42.278 --rc geninfo_unexecuted_blocks=1 00:04:42.278 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:42.278 ' 00:04:42.278 15:20:28 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:42.278 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.278 --rc genhtml_branch_coverage=1 00:04:42.278 --rc genhtml_function_coverage=1 00:04:42.278 --rc genhtml_legend=1 00:04:42.278 --rc geninfo_all_blocks=1 00:04:42.278 --rc geninfo_unexecuted_blocks=1 00:04:42.278 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:42.278 ' 00:04:42.278 15:20:28 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:42.278 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:42.278 --rc genhtml_branch_coverage=1 00:04:42.278 --rc genhtml_function_coverage=1 00:04:42.278 --rc genhtml_legend=1 00:04:42.278 --rc geninfo_all_blocks=1 00:04:42.278 --rc geninfo_unexecuted_blocks=1 00:04:42.278 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:42.278 ' 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:04:42.278 15:20:28 -- nvmf/common.sh@7 -- # uname -s 00:04:42.278 15:20:28 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:42.278 15:20:28 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:42.278 15:20:28 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:42.278 15:20:28 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:42.278 15:20:28 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:42.278 15:20:28 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:42.278 15:20:28 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:42.278 15:20:28 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:42.278 15:20:28 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:42.278 15:20:28 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:42.278 15:20:28 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:04:42.278 15:20:28 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:04:42.278 15:20:28 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:42.278 15:20:28 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:42.278 15:20:28 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:42.278 15:20:28 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:04:42.278 15:20:28 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:42.278 15:20:28 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:42.278 15:20:28 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:42.278 15:20:28 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:42.278 15:20:28 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:42.278 15:20:28 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:42.278 15:20:28 -- paths/export.sh@5 -- # export PATH 00:04:42.278 15:20:28 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:42.278 15:20:28 -- nvmf/common.sh@46 -- # : 0 00:04:42.278 15:20:28 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:04:42.278 15:20:28 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:04:42.278 15:20:28 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:04:42.278 15:20:28 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:42.278 15:20:28 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:42.278 15:20:28 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:04:42.278 15:20:28 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:04:42.278 
15:20:28 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:04:42.278 INFO: launching applications... 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@25 -- # shift 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=1572308 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:04:42.278 Waiting for target to run... 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 1572308 /var/tmp/spdk_tgt.sock 00:04:42.278 15:20:28 -- common/autotest_common.sh@829 -- # '[' -z 1572308 ']' 00:04:42.278 15:20:28 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:04:42.278 15:20:28 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:04:42.278 15:20:28 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:42.278 15:20:28 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:04:42.278 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:04:42.278 15:20:28 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:42.278 15:20:28 -- common/autotest_common.sh@10 -- # set +x 00:04:42.278 [2024-12-02 15:20:28.066300] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
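Note: the trace above shows the json_config_extra_key start path end to end: spdk_tgt is launched with "-m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json extra_key.json", and waitforlisten then blocks (up to max_retries=100) until the RPC socket answers; the teardown a few records below sends SIGINT and polls "kill -0" for up to 30 half-second intervals. A minimal sketch of that start/stop pattern, assuming the spdk tree layout used in this job (the loop bodies are illustrative, not the exact autotest_common.sh source):

SPDK_BIN=./build/bin/spdk_tgt
RPC_SOCK=/var/tmp/spdk_tgt.sock

"$SPDK_BIN" -m 0x1 -s 1024 -r "$RPC_SOCK" --json ./test/json_config/extra_key.json &
pid=$!

# waitforlisten-style poll: keep trying a cheap RPC until the target answers
for i in $(seq 1 100); do
    ./scripts/rpc.py -s "$RPC_SOCK" -t 1 spdk_get_version >/dev/null 2>&1 && break
    sleep 0.1
done

# ... exercise the target here ...

# shutdown path: SIGINT, then poll the pid for up to 30 tries, 0.5 s apart
kill -SIGINT "$pid"
for i in $(seq 1 30); do
    kill -0 "$pid" 2>/dev/null || break
    sleep 0.5
done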
00:04:42.278 [2024-12-02 15:20:28.066392] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1572308 ] 00:04:42.537 EAL: No free 2048 kB hugepages reported on node 1 00:04:42.797 [2024-12-02 15:20:28.490942] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:42.797 [2024-12-02 15:20:28.579615] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:42.797 [2024-12-02 15:20:28.579706] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:43.366 15:20:28 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:43.366 15:20:28 -- common/autotest_common.sh@862 -- # return 0 00:04:43.366 15:20:28 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:04:43.366 00:04:43.366 15:20:28 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:04:43.366 INFO: shutting down applications... 00:04:43.366 15:20:28 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:04:43.366 15:20:28 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:04:43.366 15:20:28 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:04:43.366 15:20:28 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 1572308 ]] 00:04:43.366 15:20:28 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 1572308 00:04:43.366 15:20:28 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:04:43.366 15:20:28 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:04:43.366 15:20:28 -- json_config/json_config_extra_key.sh@50 -- # kill -0 1572308 00:04:43.366 15:20:28 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:04:43.626 15:20:29 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:04:43.626 15:20:29 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:04:43.626 15:20:29 -- json_config/json_config_extra_key.sh@50 -- # kill -0 1572308 00:04:43.626 15:20:29 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:04:43.626 15:20:29 -- json_config/json_config_extra_key.sh@52 -- # break 00:04:43.626 15:20:29 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:04:43.626 15:20:29 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:04:43.626 SPDK target shutdown done 00:04:43.626 15:20:29 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:04:43.626 Success 00:04:43.626 00:04:43.626 real 0m1.539s 00:04:43.626 user 0m1.114s 00:04:43.626 sys 0m0.575s 00:04:43.626 15:20:29 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:43.626 15:20:29 -- common/autotest_common.sh@10 -- # set +x 00:04:43.626 ************************************ 00:04:43.626 END TEST json_config_extra_key 00:04:43.626 ************************************ 00:04:43.626 15:20:29 -- spdk/autotest.sh@167 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:04:43.626 15:20:29 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:43.626 15:20:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:43.626 15:20:29 -- common/autotest_common.sh@10 -- # set +x 00:04:43.626 ************************************ 00:04:43.626 START TEST alias_rpc 00:04:43.626 ************************************ 00:04:43.626 15:20:29 -- 
common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:04:43.886 * Looking for test storage... 00:04:43.886 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:04:43.886 15:20:29 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:43.886 15:20:29 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:43.886 15:20:29 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:43.886 15:20:29 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:43.886 15:20:29 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:43.886 15:20:29 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:43.886 15:20:29 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:43.886 15:20:29 -- scripts/common.sh@335 -- # IFS=.-: 00:04:43.886 15:20:29 -- scripts/common.sh@335 -- # read -ra ver1 00:04:43.886 15:20:29 -- scripts/common.sh@336 -- # IFS=.-: 00:04:43.886 15:20:29 -- scripts/common.sh@336 -- # read -ra ver2 00:04:43.886 15:20:29 -- scripts/common.sh@337 -- # local 'op=<' 00:04:43.886 15:20:29 -- scripts/common.sh@339 -- # ver1_l=2 00:04:43.886 15:20:29 -- scripts/common.sh@340 -- # ver2_l=1 00:04:43.886 15:20:29 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:43.886 15:20:29 -- scripts/common.sh@343 -- # case "$op" in 00:04:43.886 15:20:29 -- scripts/common.sh@344 -- # : 1 00:04:43.886 15:20:29 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:43.886 15:20:29 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:43.886 15:20:29 -- scripts/common.sh@364 -- # decimal 1 00:04:43.886 15:20:29 -- scripts/common.sh@352 -- # local d=1 00:04:43.886 15:20:29 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:43.886 15:20:29 -- scripts/common.sh@354 -- # echo 1 00:04:43.886 15:20:29 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:43.886 15:20:29 -- scripts/common.sh@365 -- # decimal 2 00:04:43.886 15:20:29 -- scripts/common.sh@352 -- # local d=2 00:04:43.886 15:20:29 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:43.886 15:20:29 -- scripts/common.sh@354 -- # echo 2 00:04:43.886 15:20:29 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:43.886 15:20:29 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:43.886 15:20:29 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:43.886 15:20:29 -- scripts/common.sh@367 -- # return 0 00:04:43.886 15:20:29 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:43.886 15:20:29 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:43.886 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:43.886 --rc genhtml_branch_coverage=1 00:04:43.886 --rc genhtml_function_coverage=1 00:04:43.886 --rc genhtml_legend=1 00:04:43.886 --rc geninfo_all_blocks=1 00:04:43.886 --rc geninfo_unexecuted_blocks=1 00:04:43.886 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:43.886 ' 00:04:43.886 15:20:29 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:43.886 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:43.886 --rc genhtml_branch_coverage=1 00:04:43.886 --rc genhtml_function_coverage=1 00:04:43.886 --rc genhtml_legend=1 00:04:43.886 --rc geninfo_all_blocks=1 00:04:43.886 --rc geninfo_unexecuted_blocks=1 00:04:43.886 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:43.886 ' 00:04:43.886 
15:20:29 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:43.886 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:43.886 --rc genhtml_branch_coverage=1 00:04:43.886 --rc genhtml_function_coverage=1 00:04:43.886 --rc genhtml_legend=1 00:04:43.886 --rc geninfo_all_blocks=1 00:04:43.886 --rc geninfo_unexecuted_blocks=1 00:04:43.886 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:43.886 ' 00:04:43.886 15:20:29 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:43.886 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:43.886 --rc genhtml_branch_coverage=1 00:04:43.886 --rc genhtml_function_coverage=1 00:04:43.886 --rc genhtml_legend=1 00:04:43.886 --rc geninfo_all_blocks=1 00:04:43.886 --rc geninfo_unexecuted_blocks=1 00:04:43.886 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:43.886 ' 00:04:43.886 15:20:29 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:04:43.886 15:20:29 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1572634 00:04:43.886 15:20:29 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:43.886 15:20:29 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1572634 00:04:43.886 15:20:29 -- common/autotest_common.sh@829 -- # '[' -z 1572634 ']' 00:04:43.886 15:20:29 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:43.886 15:20:29 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:43.886 15:20:29 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:43.886 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:43.886 15:20:29 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:43.886 15:20:29 -- common/autotest_common.sh@10 -- # set +x 00:04:43.886 [2024-12-02 15:20:29.649200] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
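Note: once this target is up, alias_rpc's whole interaction is "scripts/rpc.py load_config -i" against the default /var/tmp/spdk.sock socket, followed by the killprocess helper (liveness check, process-name lookup, kill, wait), all traced in the records that follow. A hedged sketch of that round trip; the inline JSON payload is a made-up example, not the test's actual input:

# Feed a JSON config to the running target on its default RPC socket.
# (-i matches the traced invocation; the config arrives on stdin.)
./scripts/rpc.py load_config -i <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        { "method": "bdev_malloc_create",
          "params": { "num_blocks": 256, "block_size": 512 } }
      ]
    }
  ]
}
EOF

# killprocess-style teardown, as traced: verify, identify, kill, reap.
if kill -0 "$pid" 2>/dev/null; then
    name=$(ps --no-headers -o comm= "$pid")   # shows up as reactor_0 here
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid"
fi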
00:04:43.886 [2024-12-02 15:20:29.649284] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1572634 ] 00:04:43.886 EAL: No free 2048 kB hugepages reported on node 1 00:04:44.146 [2024-12-02 15:20:29.714068] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:44.146 [2024-12-02 15:20:29.787181] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:44.146 [2024-12-02 15:20:29.787289] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:44.714 15:20:30 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:44.714 15:20:30 -- common/autotest_common.sh@862 -- # return 0 00:04:44.714 15:20:30 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:04:44.975 15:20:30 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1572634 00:04:44.975 15:20:30 -- common/autotest_common.sh@936 -- # '[' -z 1572634 ']' 00:04:44.975 15:20:30 -- common/autotest_common.sh@940 -- # kill -0 1572634 00:04:44.975 15:20:30 -- common/autotest_common.sh@941 -- # uname 00:04:44.975 15:20:30 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:04:44.975 15:20:30 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1572634 00:04:44.975 15:20:30 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:04:44.975 15:20:30 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:04:44.975 15:20:30 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1572634' 00:04:44.975 killing process with pid 1572634 00:04:44.975 15:20:30 -- common/autotest_common.sh@955 -- # kill 1572634 00:04:44.975 15:20:30 -- common/autotest_common.sh@960 -- # wait 1572634 00:04:45.545 00:04:45.545 real 0m1.622s 00:04:45.545 user 0m1.745s 00:04:45.545 sys 0m0.463s 00:04:45.546 15:20:31 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:45.546 15:20:31 -- common/autotest_common.sh@10 -- # set +x 00:04:45.546 ************************************ 00:04:45.546 END TEST alias_rpc 00:04:45.546 ************************************ 00:04:45.546 15:20:31 -- spdk/autotest.sh@169 -- # [[ 0 -eq 0 ]] 00:04:45.546 15:20:31 -- spdk/autotest.sh@170 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:04:45.546 15:20:31 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:45.546 15:20:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:45.546 15:20:31 -- common/autotest_common.sh@10 -- # set +x 00:04:45.546 ************************************ 00:04:45.546 START TEST spdkcli_tcp 00:04:45.546 ************************************ 00:04:45.546 15:20:31 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:04:45.546 * Looking for test storage... 
00:04:45.546 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:04:45.546 15:20:31 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:45.546 15:20:31 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:45.546 15:20:31 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:45.546 15:20:31 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:45.546 15:20:31 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:45.546 15:20:31 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:45.546 15:20:31 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:45.546 15:20:31 -- scripts/common.sh@335 -- # IFS=.-: 00:04:45.546 15:20:31 -- scripts/common.sh@335 -- # read -ra ver1 00:04:45.546 15:20:31 -- scripts/common.sh@336 -- # IFS=.-: 00:04:45.546 15:20:31 -- scripts/common.sh@336 -- # read -ra ver2 00:04:45.546 15:20:31 -- scripts/common.sh@337 -- # local 'op=<' 00:04:45.546 15:20:31 -- scripts/common.sh@339 -- # ver1_l=2 00:04:45.546 15:20:31 -- scripts/common.sh@340 -- # ver2_l=1 00:04:45.546 15:20:31 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:45.546 15:20:31 -- scripts/common.sh@343 -- # case "$op" in 00:04:45.546 15:20:31 -- scripts/common.sh@344 -- # : 1 00:04:45.546 15:20:31 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:45.546 15:20:31 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:45.546 15:20:31 -- scripts/common.sh@364 -- # decimal 1 00:04:45.546 15:20:31 -- scripts/common.sh@352 -- # local d=1 00:04:45.546 15:20:31 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:45.546 15:20:31 -- scripts/common.sh@354 -- # echo 1 00:04:45.546 15:20:31 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:45.546 15:20:31 -- scripts/common.sh@365 -- # decimal 2 00:04:45.546 15:20:31 -- scripts/common.sh@352 -- # local d=2 00:04:45.546 15:20:31 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:45.546 15:20:31 -- scripts/common.sh@354 -- # echo 2 00:04:45.546 15:20:31 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:45.546 15:20:31 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:45.546 15:20:31 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:45.546 15:20:31 -- scripts/common.sh@367 -- # return 0 00:04:45.546 15:20:31 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:45.546 15:20:31 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:45.546 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:45.546 --rc genhtml_branch_coverage=1 00:04:45.546 --rc genhtml_function_coverage=1 00:04:45.546 --rc genhtml_legend=1 00:04:45.546 --rc geninfo_all_blocks=1 00:04:45.546 --rc geninfo_unexecuted_blocks=1 00:04:45.546 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:45.546 ' 00:04:45.546 15:20:31 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:45.546 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:45.546 --rc genhtml_branch_coverage=1 00:04:45.546 --rc genhtml_function_coverage=1 00:04:45.546 --rc genhtml_legend=1 00:04:45.546 --rc geninfo_all_blocks=1 00:04:45.546 --rc geninfo_unexecuted_blocks=1 00:04:45.546 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:45.546 ' 00:04:45.546 15:20:31 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:45.546 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:45.546 --rc genhtml_branch_coverage=1 
00:04:45.546 --rc genhtml_function_coverage=1 00:04:45.546 --rc genhtml_legend=1 00:04:45.546 --rc geninfo_all_blocks=1 00:04:45.546 --rc geninfo_unexecuted_blocks=1 00:04:45.546 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:45.546 ' 00:04:45.546 15:20:31 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:45.546 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:45.546 --rc genhtml_branch_coverage=1 00:04:45.546 --rc genhtml_function_coverage=1 00:04:45.546 --rc genhtml_legend=1 00:04:45.546 --rc geninfo_all_blocks=1 00:04:45.546 --rc geninfo_unexecuted_blocks=1 00:04:45.546 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:45.546 ' 00:04:45.546 15:20:31 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:04:45.546 15:20:31 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:04:45.546 15:20:31 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:04:45.546 15:20:31 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:04:45.546 15:20:31 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:04:45.546 15:20:31 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:04:45.546 15:20:31 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:04:45.546 15:20:31 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:45.546 15:20:31 -- common/autotest_common.sh@10 -- # set +x 00:04:45.546 15:20:31 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1572972 00:04:45.546 15:20:31 -- spdkcli/tcp.sh@27 -- # waitforlisten 1572972 00:04:45.546 15:20:31 -- common/autotest_common.sh@829 -- # '[' -z 1572972 ']' 00:04:45.546 15:20:31 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:04:45.546 15:20:31 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:45.546 15:20:31 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:45.546 15:20:31 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:45.546 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:45.546 15:20:31 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:45.546 15:20:31 -- common/autotest_common.sh@10 -- # set +x 00:04:45.546 [2024-12-02 15:20:31.300312] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
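Note: this spdk_tgt instance is started with "-m 0x3 -p 0", a two-bit core mask (cores 0 and 1) with core 0 as the main core, which is why two "Reactor started" notices appear just below. A tiny illustrative sketch of how such a hex mask maps to cores:

mask=0x3
for (( core = 0; core < 64; core++ )); do
    (( (mask >> core) & 1 )) && echo "reactor expected on core $core"
done
# 0x3 -> cores 0 and 1; the earlier single-core tests used -m 0x1 -> core 0 only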
00:04:45.546 [2024-12-02 15:20:31.300379] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1572972 ] 00:04:45.546 EAL: No free 2048 kB hugepages reported on node 1 00:04:45.806 [2024-12-02 15:20:31.366342] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:04:45.806 [2024-12-02 15:20:31.435228] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:45.806 [2024-12-02 15:20:31.435399] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:04:45.806 [2024-12-02 15:20:31.435401] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:46.372 15:20:32 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:46.372 15:20:32 -- common/autotest_common.sh@862 -- # return 0 00:04:46.372 15:20:32 -- spdkcli/tcp.sh@31 -- # socat_pid=1573199 00:04:46.372 15:20:32 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:04:46.372 15:20:32 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:04:46.630 [ 00:04:46.630 "spdk_get_version", 00:04:46.630 "rpc_get_methods", 00:04:46.630 "trace_get_info", 00:04:46.630 "trace_get_tpoint_group_mask", 00:04:46.630 "trace_disable_tpoint_group", 00:04:46.630 "trace_enable_tpoint_group", 00:04:46.630 "trace_clear_tpoint_mask", 00:04:46.630 "trace_set_tpoint_mask", 00:04:46.630 "vfu_tgt_set_base_path", 00:04:46.630 "framework_get_pci_devices", 00:04:46.630 "framework_get_config", 00:04:46.630 "framework_get_subsystems", 00:04:46.630 "iobuf_get_stats", 00:04:46.630 "iobuf_set_options", 00:04:46.630 "sock_set_default_impl", 00:04:46.630 "sock_impl_set_options", 00:04:46.630 "sock_impl_get_options", 00:04:46.630 "vmd_rescan", 00:04:46.630 "vmd_remove_device", 00:04:46.630 "vmd_enable", 00:04:46.630 "accel_get_stats", 00:04:46.630 "accel_set_options", 00:04:46.630 "accel_set_driver", 00:04:46.630 "accel_crypto_key_destroy", 00:04:46.630 "accel_crypto_keys_get", 00:04:46.630 "accel_crypto_key_create", 00:04:46.630 "accel_assign_opc", 00:04:46.630 "accel_get_module_info", 00:04:46.630 "accel_get_opc_assignments", 00:04:46.630 "notify_get_notifications", 00:04:46.630 "notify_get_types", 00:04:46.630 "bdev_get_histogram", 00:04:46.630 "bdev_enable_histogram", 00:04:46.630 "bdev_set_qos_limit", 00:04:46.630 "bdev_set_qd_sampling_period", 00:04:46.630 "bdev_get_bdevs", 00:04:46.630 "bdev_reset_iostat", 00:04:46.630 "bdev_get_iostat", 00:04:46.630 "bdev_examine", 00:04:46.630 "bdev_wait_for_examine", 00:04:46.630 "bdev_set_options", 00:04:46.630 "scsi_get_devices", 00:04:46.630 "thread_set_cpumask", 00:04:46.630 "framework_get_scheduler", 00:04:46.630 "framework_set_scheduler", 00:04:46.630 "framework_get_reactors", 00:04:46.630 "thread_get_io_channels", 00:04:46.630 "thread_get_pollers", 00:04:46.630 "thread_get_stats", 00:04:46.630 "framework_monitor_context_switch", 00:04:46.630 "spdk_kill_instance", 00:04:46.630 "log_enable_timestamps", 00:04:46.630 "log_get_flags", 00:04:46.630 "log_clear_flag", 00:04:46.630 "log_set_flag", 00:04:46.630 "log_get_level", 00:04:46.630 "log_set_level", 00:04:46.630 "log_get_print_level", 00:04:46.630 "log_set_print_level", 00:04:46.630 "framework_enable_cpumask_locks", 00:04:46.630 "framework_disable_cpumask_locks", 00:04:46.630 "framework_wait_init", 00:04:46.630 
"framework_start_init", 00:04:46.630 "virtio_blk_create_transport", 00:04:46.630 "virtio_blk_get_transports", 00:04:46.630 "vhost_controller_set_coalescing", 00:04:46.630 "vhost_get_controllers", 00:04:46.630 "vhost_delete_controller", 00:04:46.630 "vhost_create_blk_controller", 00:04:46.630 "vhost_scsi_controller_remove_target", 00:04:46.630 "vhost_scsi_controller_add_target", 00:04:46.630 "vhost_start_scsi_controller", 00:04:46.630 "vhost_create_scsi_controller", 00:04:46.630 "ublk_recover_disk", 00:04:46.630 "ublk_get_disks", 00:04:46.630 "ublk_stop_disk", 00:04:46.630 "ublk_start_disk", 00:04:46.630 "ublk_destroy_target", 00:04:46.630 "ublk_create_target", 00:04:46.630 "nbd_get_disks", 00:04:46.630 "nbd_stop_disk", 00:04:46.630 "nbd_start_disk", 00:04:46.630 "env_dpdk_get_mem_stats", 00:04:46.630 "nvmf_subsystem_get_listeners", 00:04:46.630 "nvmf_subsystem_get_qpairs", 00:04:46.630 "nvmf_subsystem_get_controllers", 00:04:46.630 "nvmf_get_stats", 00:04:46.630 "nvmf_get_transports", 00:04:46.630 "nvmf_create_transport", 00:04:46.630 "nvmf_get_targets", 00:04:46.630 "nvmf_delete_target", 00:04:46.630 "nvmf_create_target", 00:04:46.630 "nvmf_subsystem_allow_any_host", 00:04:46.630 "nvmf_subsystem_remove_host", 00:04:46.630 "nvmf_subsystem_add_host", 00:04:46.630 "nvmf_subsystem_remove_ns", 00:04:46.630 "nvmf_subsystem_add_ns", 00:04:46.630 "nvmf_subsystem_listener_set_ana_state", 00:04:46.630 "nvmf_discovery_get_referrals", 00:04:46.630 "nvmf_discovery_remove_referral", 00:04:46.630 "nvmf_discovery_add_referral", 00:04:46.630 "nvmf_subsystem_remove_listener", 00:04:46.630 "nvmf_subsystem_add_listener", 00:04:46.630 "nvmf_delete_subsystem", 00:04:46.630 "nvmf_create_subsystem", 00:04:46.630 "nvmf_get_subsystems", 00:04:46.630 "nvmf_set_crdt", 00:04:46.630 "nvmf_set_config", 00:04:46.630 "nvmf_set_max_subsystems", 00:04:46.630 "iscsi_set_options", 00:04:46.630 "iscsi_get_auth_groups", 00:04:46.630 "iscsi_auth_group_remove_secret", 00:04:46.630 "iscsi_auth_group_add_secret", 00:04:46.630 "iscsi_delete_auth_group", 00:04:46.630 "iscsi_create_auth_group", 00:04:46.630 "iscsi_set_discovery_auth", 00:04:46.630 "iscsi_get_options", 00:04:46.630 "iscsi_target_node_request_logout", 00:04:46.630 "iscsi_target_node_set_redirect", 00:04:46.630 "iscsi_target_node_set_auth", 00:04:46.630 "iscsi_target_node_add_lun", 00:04:46.630 "iscsi_get_connections", 00:04:46.630 "iscsi_portal_group_set_auth", 00:04:46.630 "iscsi_start_portal_group", 00:04:46.630 "iscsi_delete_portal_group", 00:04:46.630 "iscsi_create_portal_group", 00:04:46.630 "iscsi_get_portal_groups", 00:04:46.630 "iscsi_delete_target_node", 00:04:46.630 "iscsi_target_node_remove_pg_ig_maps", 00:04:46.630 "iscsi_target_node_add_pg_ig_maps", 00:04:46.630 "iscsi_create_target_node", 00:04:46.630 "iscsi_get_target_nodes", 00:04:46.630 "iscsi_delete_initiator_group", 00:04:46.630 "iscsi_initiator_group_remove_initiators", 00:04:46.630 "iscsi_initiator_group_add_initiators", 00:04:46.630 "iscsi_create_initiator_group", 00:04:46.630 "iscsi_get_initiator_groups", 00:04:46.630 "vfu_virtio_create_scsi_endpoint", 00:04:46.630 "vfu_virtio_scsi_remove_target", 00:04:46.630 "vfu_virtio_scsi_add_target", 00:04:46.630 "vfu_virtio_create_blk_endpoint", 00:04:46.630 "vfu_virtio_delete_endpoint", 00:04:46.630 "iaa_scan_accel_module", 00:04:46.630 "dsa_scan_accel_module", 00:04:46.630 "ioat_scan_accel_module", 00:04:46.630 "accel_error_inject_error", 00:04:46.630 "bdev_iscsi_delete", 00:04:46.630 "bdev_iscsi_create", 00:04:46.630 "bdev_iscsi_set_options", 
00:04:46.630 "bdev_virtio_attach_controller", 00:04:46.630 "bdev_virtio_scsi_get_devices", 00:04:46.630 "bdev_virtio_detach_controller", 00:04:46.630 "bdev_virtio_blk_set_hotplug", 00:04:46.630 "bdev_ftl_set_property", 00:04:46.630 "bdev_ftl_get_properties", 00:04:46.630 "bdev_ftl_get_stats", 00:04:46.630 "bdev_ftl_unmap", 00:04:46.630 "bdev_ftl_unload", 00:04:46.630 "bdev_ftl_delete", 00:04:46.630 "bdev_ftl_load", 00:04:46.630 "bdev_ftl_create", 00:04:46.630 "bdev_aio_delete", 00:04:46.630 "bdev_aio_rescan", 00:04:46.630 "bdev_aio_create", 00:04:46.630 "blobfs_create", 00:04:46.630 "blobfs_detect", 00:04:46.630 "blobfs_set_cache_size", 00:04:46.630 "bdev_zone_block_delete", 00:04:46.630 "bdev_zone_block_create", 00:04:46.630 "bdev_delay_delete", 00:04:46.630 "bdev_delay_create", 00:04:46.630 "bdev_delay_update_latency", 00:04:46.630 "bdev_split_delete", 00:04:46.630 "bdev_split_create", 00:04:46.630 "bdev_error_inject_error", 00:04:46.631 "bdev_error_delete", 00:04:46.631 "bdev_error_create", 00:04:46.631 "bdev_raid_set_options", 00:04:46.631 "bdev_raid_remove_base_bdev", 00:04:46.631 "bdev_raid_add_base_bdev", 00:04:46.631 "bdev_raid_delete", 00:04:46.631 "bdev_raid_create", 00:04:46.631 "bdev_raid_get_bdevs", 00:04:46.631 "bdev_lvol_grow_lvstore", 00:04:46.631 "bdev_lvol_get_lvols", 00:04:46.631 "bdev_lvol_get_lvstores", 00:04:46.631 "bdev_lvol_delete", 00:04:46.631 "bdev_lvol_set_read_only", 00:04:46.631 "bdev_lvol_resize", 00:04:46.631 "bdev_lvol_decouple_parent", 00:04:46.631 "bdev_lvol_inflate", 00:04:46.631 "bdev_lvol_rename", 00:04:46.631 "bdev_lvol_clone_bdev", 00:04:46.631 "bdev_lvol_clone", 00:04:46.631 "bdev_lvol_snapshot", 00:04:46.631 "bdev_lvol_create", 00:04:46.631 "bdev_lvol_delete_lvstore", 00:04:46.631 "bdev_lvol_rename_lvstore", 00:04:46.631 "bdev_lvol_create_lvstore", 00:04:46.631 "bdev_passthru_delete", 00:04:46.631 "bdev_passthru_create", 00:04:46.631 "bdev_nvme_cuse_unregister", 00:04:46.631 "bdev_nvme_cuse_register", 00:04:46.631 "bdev_opal_new_user", 00:04:46.631 "bdev_opal_set_lock_state", 00:04:46.631 "bdev_opal_delete", 00:04:46.631 "bdev_opal_get_info", 00:04:46.631 "bdev_opal_create", 00:04:46.631 "bdev_nvme_opal_revert", 00:04:46.631 "bdev_nvme_opal_init", 00:04:46.631 "bdev_nvme_send_cmd", 00:04:46.631 "bdev_nvme_get_path_iostat", 00:04:46.631 "bdev_nvme_get_mdns_discovery_info", 00:04:46.631 "bdev_nvme_stop_mdns_discovery", 00:04:46.631 "bdev_nvme_start_mdns_discovery", 00:04:46.631 "bdev_nvme_set_multipath_policy", 00:04:46.631 "bdev_nvme_set_preferred_path", 00:04:46.631 "bdev_nvme_get_io_paths", 00:04:46.631 "bdev_nvme_remove_error_injection", 00:04:46.631 "bdev_nvme_add_error_injection", 00:04:46.631 "bdev_nvme_get_discovery_info", 00:04:46.631 "bdev_nvme_stop_discovery", 00:04:46.631 "bdev_nvme_start_discovery", 00:04:46.631 "bdev_nvme_get_controller_health_info", 00:04:46.631 "bdev_nvme_disable_controller", 00:04:46.631 "bdev_nvme_enable_controller", 00:04:46.631 "bdev_nvme_reset_controller", 00:04:46.631 "bdev_nvme_get_transport_statistics", 00:04:46.631 "bdev_nvme_apply_firmware", 00:04:46.631 "bdev_nvme_detach_controller", 00:04:46.631 "bdev_nvme_get_controllers", 00:04:46.631 "bdev_nvme_attach_controller", 00:04:46.631 "bdev_nvme_set_hotplug", 00:04:46.631 "bdev_nvme_set_options", 00:04:46.631 "bdev_null_resize", 00:04:46.631 "bdev_null_delete", 00:04:46.631 "bdev_null_create", 00:04:46.631 "bdev_malloc_delete", 00:04:46.631 "bdev_malloc_create" 00:04:46.631 ] 00:04:46.631 15:20:32 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:04:46.631 15:20:32 -- common/autotest_common.sh@728 -- # xtrace_disable 00:04:46.631 15:20:32 -- common/autotest_common.sh@10 -- # set +x 00:04:46.631 15:20:32 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:04:46.631 15:20:32 -- spdkcli/tcp.sh@38 -- # killprocess 1572972 00:04:46.631 15:20:32 -- common/autotest_common.sh@936 -- # '[' -z 1572972 ']' 00:04:46.631 15:20:32 -- common/autotest_common.sh@940 -- # kill -0 1572972 00:04:46.631 15:20:32 -- common/autotest_common.sh@941 -- # uname 00:04:46.631 15:20:32 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:04:46.631 15:20:32 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1572972 00:04:46.631 15:20:32 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:04:46.631 15:20:32 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:04:46.631 15:20:32 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1572972' 00:04:46.631 killing process with pid 1572972 00:04:46.631 15:20:32 -- common/autotest_common.sh@955 -- # kill 1572972 00:04:46.631 15:20:32 -- common/autotest_common.sh@960 -- # wait 1572972 00:04:46.889 00:04:46.889 real 0m1.602s 00:04:46.889 user 0m2.914s 00:04:46.889 sys 0m0.499s 00:04:46.889 15:20:32 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:46.889 15:20:32 -- common/autotest_common.sh@10 -- # set +x 00:04:46.889 ************************************ 00:04:46.889 END TEST spdkcli_tcp 00:04:46.889 ************************************ 00:04:47.148 15:20:32 -- spdk/autotest.sh@173 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:04:47.148 15:20:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:47.148 15:20:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:47.148 15:20:32 -- common/autotest_common.sh@10 -- # set +x 00:04:47.148 ************************************ 00:04:47.148 START TEST dpdk_mem_utility 00:04:47.148 ************************************ 00:04:47.148 15:20:32 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:04:47.148 * Looking for test storage... 
00:04:47.148 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:04:47.148 15:20:32 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:47.148 15:20:32 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:47.148 15:20:32 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:47.148 15:20:32 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:47.148 15:20:32 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:47.148 15:20:32 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:47.148 15:20:32 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:47.148 15:20:32 -- scripts/common.sh@335 -- # IFS=.-: 00:04:47.148 15:20:32 -- scripts/common.sh@335 -- # read -ra ver1 00:04:47.148 15:20:32 -- scripts/common.sh@336 -- # IFS=.-: 00:04:47.148 15:20:32 -- scripts/common.sh@336 -- # read -ra ver2 00:04:47.148 15:20:32 -- scripts/common.sh@337 -- # local 'op=<' 00:04:47.148 15:20:32 -- scripts/common.sh@339 -- # ver1_l=2 00:04:47.148 15:20:32 -- scripts/common.sh@340 -- # ver2_l=1 00:04:47.148 15:20:32 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:47.148 15:20:32 -- scripts/common.sh@343 -- # case "$op" in 00:04:47.148 15:20:32 -- scripts/common.sh@344 -- # : 1 00:04:47.148 15:20:32 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:47.148 15:20:32 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:47.148 15:20:32 -- scripts/common.sh@364 -- # decimal 1 00:04:47.148 15:20:32 -- scripts/common.sh@352 -- # local d=1 00:04:47.148 15:20:32 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:47.148 15:20:32 -- scripts/common.sh@354 -- # echo 1 00:04:47.149 15:20:32 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:47.149 15:20:32 -- scripts/common.sh@365 -- # decimal 2 00:04:47.149 15:20:32 -- scripts/common.sh@352 -- # local d=2 00:04:47.149 15:20:32 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:47.149 15:20:32 -- scripts/common.sh@354 -- # echo 2 00:04:47.149 15:20:32 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:47.149 15:20:32 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:47.149 15:20:32 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:47.149 15:20:32 -- scripts/common.sh@367 -- # return 0 00:04:47.149 15:20:32 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:47.149 15:20:32 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:47.149 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:47.149 --rc genhtml_branch_coverage=1 00:04:47.149 --rc genhtml_function_coverage=1 00:04:47.149 --rc genhtml_legend=1 00:04:47.149 --rc geninfo_all_blocks=1 00:04:47.149 --rc geninfo_unexecuted_blocks=1 00:04:47.149 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:47.149 ' 00:04:47.149 15:20:32 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:47.149 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:47.149 --rc genhtml_branch_coverage=1 00:04:47.149 --rc genhtml_function_coverage=1 00:04:47.149 --rc genhtml_legend=1 00:04:47.149 --rc geninfo_all_blocks=1 00:04:47.149 --rc geninfo_unexecuted_blocks=1 00:04:47.149 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:47.149 ' 00:04:47.149 15:20:32 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:47.149 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:47.149 --rc 
genhtml_branch_coverage=1 00:04:47.149 --rc genhtml_function_coverage=1 00:04:47.149 --rc genhtml_legend=1 00:04:47.149 --rc geninfo_all_blocks=1 00:04:47.149 --rc geninfo_unexecuted_blocks=1 00:04:47.149 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:47.149 ' 00:04:47.149 15:20:32 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:47.149 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:47.149 --rc genhtml_branch_coverage=1 00:04:47.149 --rc genhtml_function_coverage=1 00:04:47.149 --rc genhtml_legend=1 00:04:47.149 --rc geninfo_all_blocks=1 00:04:47.149 --rc geninfo_unexecuted_blocks=1 00:04:47.149 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:47.149 ' 00:04:47.149 15:20:32 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:04:47.149 15:20:32 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1573315 00:04:47.149 15:20:32 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1573315 00:04:47.149 15:20:32 -- common/autotest_common.sh@829 -- # '[' -z 1573315 ']' 00:04:47.149 15:20:32 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:47.149 15:20:32 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:47.149 15:20:32 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:47.149 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:47.149 15:20:32 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:47.149 15:20:32 -- common/autotest_common.sh@10 -- # set +x 00:04:47.149 15:20:32 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:47.149 [2024-12-02 15:20:32.949780] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:04:47.149 [2024-12-02 15:20:32.949870] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1573315 ] 00:04:47.408 EAL: No free 2048 kB hugepages reported on node 1 00:04:47.408 [2024-12-02 15:20:33.019277] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:47.408 [2024-12-02 15:20:33.093100] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:47.408 [2024-12-02 15:20:33.093205] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:47.976 15:20:33 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:47.977 15:20:33 -- common/autotest_common.sh@862 -- # return 0 00:04:47.977 15:20:33 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:04:47.977 15:20:33 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:04:47.977 15:20:33 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:47.977 15:20:33 -- common/autotest_common.sh@10 -- # set +x 00:04:47.977 { 00:04:47.977 "filename": "/tmp/spdk_mem_dump.txt" 00:04:47.977 } 00:04:47.977 15:20:33 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:47.977 15:20:33 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:04:48.238 DPDK memory size 814.000000 MiB in 1 heap(s) 00:04:48.238 1 heaps totaling size 814.000000 MiB 00:04:48.238 size: 814.000000 MiB heap id: 0 00:04:48.238 end heaps---------- 00:04:48.238 8 mempools totaling size 598.116089 MiB 00:04:48.238 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:04:48.238 size: 158.602051 MiB name: PDU_data_out_Pool 00:04:48.238 size: 84.521057 MiB name: bdev_io_1573315 00:04:48.238 size: 51.011292 MiB name: evtpool_1573315 00:04:48.238 size: 50.003479 MiB name: msgpool_1573315 00:04:48.238 size: 21.763794 MiB name: PDU_Pool 00:04:48.238 size: 19.513306 MiB name: SCSI_TASK_Pool 00:04:48.238 size: 0.026123 MiB name: Session_Pool 00:04:48.238 end mempools------- 00:04:48.238 6 memzones totaling size 4.142822 MiB 00:04:48.238 size: 1.000366 MiB name: RG_ring_0_1573315 00:04:48.238 size: 1.000366 MiB name: RG_ring_1_1573315 00:04:48.238 size: 1.000366 MiB name: RG_ring_4_1573315 00:04:48.238 size: 1.000366 MiB name: RG_ring_5_1573315 00:04:48.238 size: 0.125366 MiB name: RG_ring_2_1573315 00:04:48.238 size: 0.015991 MiB name: RG_ring_3_1573315 00:04:48.238 end memzones------- 00:04:48.238 15:20:33 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:04:48.238 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:04:48.238 list of free elements. 
size: 12.519348 MiB 00:04:48.238 element at address: 0x200000400000 with size: 1.999512 MiB 00:04:48.238 element at address: 0x200018e00000 with size: 0.999878 MiB 00:04:48.238 element at address: 0x200019000000 with size: 0.999878 MiB 00:04:48.238 element at address: 0x200003e00000 with size: 0.996277 MiB 00:04:48.238 element at address: 0x200031c00000 with size: 0.994446 MiB 00:04:48.238 element at address: 0x200013800000 with size: 0.978699 MiB 00:04:48.238 element at address: 0x200007000000 with size: 0.959839 MiB 00:04:48.238 element at address: 0x200019200000 with size: 0.936584 MiB 00:04:48.238 element at address: 0x200000200000 with size: 0.841614 MiB 00:04:48.238 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:04:48.238 element at address: 0x20000b200000 with size: 0.490723 MiB 00:04:48.238 element at address: 0x200000800000 with size: 0.487793 MiB 00:04:48.238 element at address: 0x200019400000 with size: 0.485657 MiB 00:04:48.238 element at address: 0x200027e00000 with size: 0.410034 MiB 00:04:48.238 element at address: 0x200003a00000 with size: 0.355530 MiB 00:04:48.238 list of standard malloc elements. size: 199.218079 MiB 00:04:48.238 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:04:48.238 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:04:48.238 element at address: 0x200018efff80 with size: 1.000122 MiB 00:04:48.238 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:04:48.238 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:04:48.238 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:04:48.238 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:04:48.238 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:04:48.238 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:04:48.239 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:04:48.239 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:04:48.239 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:04:48.239 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:04:48.239 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:04:48.239 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:04:48.239 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:04:48.239 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:04:48.239 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:04:48.239 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:04:48.239 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:04:48.239 element at address: 0x200003adb300 with size: 0.000183 MiB 00:04:48.239 element at address: 0x200003adb500 with size: 0.000183 MiB 00:04:48.239 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:04:48.239 element at address: 0x200003affa80 with size: 0.000183 MiB 00:04:48.239 element at address: 0x200003affb40 with size: 0.000183 MiB 00:04:48.239 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:04:48.239 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:04:48.239 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:04:48.239 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:04:48.239 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:04:48.239 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:04:48.239 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:04:48.239 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:04:48.239 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:04:48.239 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:04:48.239 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:04:48.239 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:04:48.239 element at address: 0x200027e69040 with size: 0.000183 MiB 00:04:48.239 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:04:48.239 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:04:48.239 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:04:48.239 list of memzone associated elements. size: 602.262573 MiB 00:04:48.239 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:04:48.239 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:04:48.239 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:04:48.239 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:04:48.239 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:04:48.239 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1573315_0 00:04:48.239 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:04:48.239 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1573315_0 00:04:48.239 element at address: 0x200003fff380 with size: 48.003052 MiB 00:04:48.239 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1573315_0 00:04:48.239 element at address: 0x2000195be940 with size: 20.255554 MiB 00:04:48.239 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:04:48.239 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:04:48.239 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:04:48.239 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:04:48.239 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1573315 00:04:48.239 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:04:48.239 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1573315 00:04:48.239 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:04:48.239 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1573315 00:04:48.239 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:04:48.239 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:04:48.239 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:04:48.239 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:04:48.239 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:04:48.239 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:04:48.239 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:04:48.239 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:04:48.239 element at address: 0x200003eff180 with size: 1.000488 MiB 00:04:48.239 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1573315 00:04:48.239 element at address: 0x200003affc00 with size: 1.000488 MiB 00:04:48.239 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1573315 00:04:48.239 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:04:48.239 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1573315 00:04:48.239 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:04:48.239 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1573315 00:04:48.239 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:04:48.239 associated 
memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1573315 00:04:48.239 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:04:48.239 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:04:48.239 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:04:48.239 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:04:48.239 element at address: 0x20001947c540 with size: 0.250488 MiB 00:04:48.239 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:04:48.239 element at address: 0x200003adf880 with size: 0.125488 MiB 00:04:48.239 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1573315 00:04:48.239 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:04:48.239 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:04:48.239 element at address: 0x200027e69100 with size: 0.023743 MiB 00:04:48.239 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:04:48.239 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:04:48.239 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1573315 00:04:48.239 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:04:48.239 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:04:48.239 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:04:48.239 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1573315 00:04:48.239 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:04:48.239 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1573315 00:04:48.239 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:04:48.239 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:04:48.239 15:20:33 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:04:48.239 15:20:33 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1573315 00:04:48.239 15:20:33 -- common/autotest_common.sh@936 -- # '[' -z 1573315 ']' 00:04:48.239 15:20:33 -- common/autotest_common.sh@940 -- # kill -0 1573315 00:04:48.239 15:20:33 -- common/autotest_common.sh@941 -- # uname 00:04:48.239 15:20:33 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:04:48.239 15:20:33 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1573315 00:04:48.239 15:20:33 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:04:48.239 15:20:33 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:04:48.239 15:20:33 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1573315' 00:04:48.239 killing process with pid 1573315 00:04:48.239 15:20:33 -- common/autotest_common.sh@955 -- # kill 1573315 00:04:48.239 15:20:33 -- common/autotest_common.sh@960 -- # wait 1573315 00:04:48.499 00:04:48.499 real 0m1.480s 00:04:48.499 user 0m1.534s 00:04:48.499 sys 0m0.432s 00:04:48.499 15:20:34 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:48.499 15:20:34 -- common/autotest_common.sh@10 -- # set +x 00:04:48.499 ************************************ 00:04:48.499 END TEST dpdk_mem_utility 00:04:48.499 ************************************ 00:04:48.499 15:20:34 -- spdk/autotest.sh@174 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:04:48.499 15:20:34 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:48.499 15:20:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:48.499 15:20:34 -- common/autotest_common.sh@10 -- # set +x 
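Note: the memory report a few records up is produced in two steps: the env_dpdk_get_mem_stats RPC makes the target write its DPDK allocator state to /tmp/spdk_mem_dump.txt, and scripts/dpdk_mem_info.py renders that file (no arguments for the heap/mempool/memzone summary, -m 0 for the element-level view of heap id 0). Reproducing it by hand against a running target, assuming the dump path shown in the trace:

# Ask the target to dump its DPDK memory state, then pretty-print it.
./scripts/rpc.py env_dpdk_get_mem_stats    # -> {"filename": "/tmp/spdk_mem_dump.txt"}
./scripts/dpdk_mem_info.py                 # heaps, mempools, memzones
./scripts/dpdk_mem_info.py -m 0            # per-element detail for heap id 0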
00:04:48.499 ************************************ 00:04:48.499 START TEST event 00:04:48.499 ************************************ 00:04:48.499 15:20:34 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:04:48.759 * Looking for test storage... 00:04:48.759 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:04:48.759 15:20:34 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:48.759 15:20:34 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:48.759 15:20:34 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:48.759 15:20:34 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:48.759 15:20:34 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:48.759 15:20:34 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:48.759 15:20:34 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:48.759 15:20:34 -- scripts/common.sh@335 -- # IFS=.-: 00:04:48.759 15:20:34 -- scripts/common.sh@335 -- # read -ra ver1 00:04:48.759 15:20:34 -- scripts/common.sh@336 -- # IFS=.-: 00:04:48.759 15:20:34 -- scripts/common.sh@336 -- # read -ra ver2 00:04:48.759 15:20:34 -- scripts/common.sh@337 -- # local 'op=<' 00:04:48.759 15:20:34 -- scripts/common.sh@339 -- # ver1_l=2 00:04:48.759 15:20:34 -- scripts/common.sh@340 -- # ver2_l=1 00:04:48.759 15:20:34 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:48.759 15:20:34 -- scripts/common.sh@343 -- # case "$op" in 00:04:48.759 15:20:34 -- scripts/common.sh@344 -- # : 1 00:04:48.759 15:20:34 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:48.759 15:20:34 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:48.759 15:20:34 -- scripts/common.sh@364 -- # decimal 1 00:04:48.759 15:20:34 -- scripts/common.sh@352 -- # local d=1 00:04:48.759 15:20:34 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:48.759 15:20:34 -- scripts/common.sh@354 -- # echo 1 00:04:48.759 15:20:34 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:48.759 15:20:34 -- scripts/common.sh@365 -- # decimal 2 00:04:48.759 15:20:34 -- scripts/common.sh@352 -- # local d=2 00:04:48.759 15:20:34 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:48.759 15:20:34 -- scripts/common.sh@354 -- # echo 2 00:04:48.759 15:20:34 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:48.759 15:20:34 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:48.759 15:20:34 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:48.759 15:20:34 -- scripts/common.sh@367 -- # return 0 00:04:48.759 15:20:34 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:48.760 15:20:34 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:48.760 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:48.760 --rc genhtml_branch_coverage=1 00:04:48.760 --rc genhtml_function_coverage=1 00:04:48.760 --rc genhtml_legend=1 00:04:48.760 --rc geninfo_all_blocks=1 00:04:48.760 --rc geninfo_unexecuted_blocks=1 00:04:48.760 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:48.760 ' 00:04:48.760 15:20:34 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:48.760 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:48.760 --rc genhtml_branch_coverage=1 00:04:48.760 --rc genhtml_function_coverage=1 00:04:48.760 --rc genhtml_legend=1 00:04:48.760 --rc geninfo_all_blocks=1 00:04:48.760 --rc geninfo_unexecuted_blocks=1 00:04:48.760 
--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:48.760 ' 00:04:48.760 15:20:34 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:48.760 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:48.760 --rc genhtml_branch_coverage=1 00:04:48.760 --rc genhtml_function_coverage=1 00:04:48.760 --rc genhtml_legend=1 00:04:48.760 --rc geninfo_all_blocks=1 00:04:48.760 --rc geninfo_unexecuted_blocks=1 00:04:48.760 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:48.760 ' 00:04:48.760 15:20:34 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:48.760 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:48.760 --rc genhtml_branch_coverage=1 00:04:48.760 --rc genhtml_function_coverage=1 00:04:48.760 --rc genhtml_legend=1 00:04:48.760 --rc geninfo_all_blocks=1 00:04:48.760 --rc geninfo_unexecuted_blocks=1 00:04:48.760 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:48.760 ' 00:04:48.760 15:20:34 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:04:48.760 15:20:34 -- bdev/nbd_common.sh@6 -- # set -e 00:04:48.760 15:20:34 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:04:48.760 15:20:34 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:04:48.760 15:20:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:48.760 15:20:34 -- common/autotest_common.sh@10 -- # set +x 00:04:48.760 ************************************ 00:04:48.760 START TEST event_perf 00:04:48.760 ************************************ 00:04:48.760 15:20:34 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:04:48.760 Running I/O for 1 seconds...[2024-12-02 15:20:34.486251] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:04:48.760 [2024-12-02 15:20:34.486339] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1573646 ] 00:04:48.760 EAL: No free 2048 kB hugepages reported on node 1 00:04:48.760 [2024-12-02 15:20:34.558484] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:04:49.019 [2024-12-02 15:20:34.636495] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:04:49.019 [2024-12-02 15:20:34.636510] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:04:49.019 [2024-12-02 15:20:34.636527] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:04:49.019 [2024-12-02 15:20:34.636529] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:49.956 Running I/O for 1 seconds... 00:04:49.956 lcore 0: 196482 00:04:49.956 lcore 1: 196481 00:04:49.956 lcore 2: 196481 00:04:49.956 lcore 3: 196480 00:04:49.956 done. 
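event_perf drove one reactor per core of the 0xF mask for one second; the four lcore counters landing within a few events of each other (~196,480 apiece) is the pass signal here. Every unit in this log is wrapped by run_test, whose shape can be read off the banners, the arg-count guard ('[' 6 -le 1 ']') and the real/user/sys block that follows; a rough sketch, with the details inferred:

    # run_test <name> <cmd...>: banner, time the command, banner (inferred from this log)
    run_test() {
        local name=$1; shift
        (( $# >= 1 )) || return 1    # the '[' N -le 1 ']' guard seen in the trace
        echo "************************************"
        echo "START TEST $name"
        echo "************************************"
        time "$@"                    # emits the real/user/sys lines seen after each unit
        echo "************************************"
        echo "END TEST $name"
        echo "************************************"
    }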
00:04:49.956 00:04:49.956 real 0m1.234s 00:04:49.956 user 0m4.146s 00:04:49.956 sys 0m0.084s 00:04:49.956 15:20:35 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:49.956 15:20:35 -- common/autotest_common.sh@10 -- # set +x 00:04:49.956 ************************************ 00:04:49.956 END TEST event_perf 00:04:49.956 ************************************ 00:04:49.956 15:20:35 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:04:49.956 15:20:35 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:04:49.956 15:20:35 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:49.956 15:20:35 -- common/autotest_common.sh@10 -- # set +x 00:04:49.956 ************************************ 00:04:49.956 START TEST event_reactor 00:04:49.956 ************************************ 00:04:49.956 15:20:35 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:04:49.956 [2024-12-02 15:20:35.771239] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:04:49.956 [2024-12-02 15:20:35.771344] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1573935 ] 00:04:50.216 EAL: No free 2048 kB hugepages reported on node 1 00:04:50.216 [2024-12-02 15:20:35.842956] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:50.216 [2024-12-02 15:20:35.912354] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:51.595 test_start 00:04:51.595 oneshot 00:04:51.595 tick 100 00:04:51.595 tick 100 00:04:51.595 tick 250 00:04:51.595 tick 100 00:04:51.595 tick 100 00:04:51.595 tick 100 00:04:51.595 tick 250 00:04:51.595 tick 500 00:04:51.595 tick 100 00:04:51.595 tick 100 00:04:51.595 tick 250 00:04:51.595 tick 100 00:04:51.595 tick 100 00:04:51.595 test_end 00:04:51.595 00:04:51.595 real 0m1.225s 00:04:51.595 user 0m1.141s 00:04:51.595 sys 0m0.079s 00:04:51.595 15:20:36 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:51.595 15:20:36 -- common/autotest_common.sh@10 -- # set +x 00:04:51.595 ************************************ 00:04:51.595 END TEST event_reactor 00:04:51.595 ************************************ 00:04:51.596 15:20:37 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:04:51.596 15:20:37 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:04:51.596 15:20:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:51.596 15:20:37 -- common/autotest_common.sh@10 -- # set +x 00:04:51.596 ************************************ 00:04:51.596 START TEST event_reactor_perf 00:04:51.596 ************************************ 00:04:51.596 15:20:37 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:04:51.596 [2024-12-02 15:20:37.046009] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:04:51.596 [2024-12-02 15:20:37.046132] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1574220 ] 00:04:51.596 EAL: No free 2048 kB hugepages reported on node 1 00:04:51.596 [2024-12-02 15:20:37.116204] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:51.596 [2024-12-02 15:20:37.181090] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:52.535 test_start 00:04:52.535 test_end 00:04:52.535 Performance: 967313 events per second 00:04:52.535 00:04:52.535 real 0m1.217s 00:04:52.535 user 0m1.127s 00:04:52.535 sys 0m0.086s 00:04:52.535 15:20:38 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:52.535 15:20:38 -- common/autotest_common.sh@10 -- # set +x 00:04:52.535 ************************************ 00:04:52.535 END TEST event_reactor_perf 00:04:52.535 ************************************ 00:04:52.535 15:20:38 -- event/event.sh@49 -- # uname -s 00:04:52.535 15:20:38 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:04:52.535 15:20:38 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:04:52.535 15:20:38 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:52.535 15:20:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:52.535 15:20:38 -- common/autotest_common.sh@10 -- # set +x 00:04:52.535 ************************************ 00:04:52.535 START TEST event_scheduler 00:04:52.535 ************************************ 00:04:52.535 15:20:38 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:04:52.795 * Looking for test storage... 00:04:52.795 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:04:52.795 15:20:38 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:52.795 15:20:38 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:52.795 15:20:38 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:52.795 15:20:38 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:52.795 15:20:38 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:52.795 15:20:38 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:52.795 15:20:38 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:52.795 15:20:38 -- scripts/common.sh@335 -- # IFS=.-: 00:04:52.795 15:20:38 -- scripts/common.sh@335 -- # read -ra ver1 00:04:52.795 15:20:38 -- scripts/common.sh@336 -- # IFS=.-: 00:04:52.795 15:20:38 -- scripts/common.sh@336 -- # read -ra ver2 00:04:52.795 15:20:38 -- scripts/common.sh@337 -- # local 'op=<' 00:04:52.795 15:20:38 -- scripts/common.sh@339 -- # ver1_l=2 00:04:52.795 15:20:38 -- scripts/common.sh@340 -- # ver2_l=1 00:04:52.795 15:20:38 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:52.795 15:20:38 -- scripts/common.sh@343 -- # case "$op" in 00:04:52.795 15:20:38 -- scripts/common.sh@344 -- # : 1 00:04:52.795 15:20:38 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:52.795 15:20:38 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:52.795 15:20:38 -- scripts/common.sh@364 -- # decimal 1 00:04:52.795 15:20:38 -- scripts/common.sh@352 -- # local d=1 00:04:52.795 15:20:38 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:52.795 15:20:38 -- scripts/common.sh@354 -- # echo 1 00:04:52.795 15:20:38 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:52.795 15:20:38 -- scripts/common.sh@365 -- # decimal 2 00:04:52.795 15:20:38 -- scripts/common.sh@352 -- # local d=2 00:04:52.795 15:20:38 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:52.795 15:20:38 -- scripts/common.sh@354 -- # echo 2 00:04:52.795 15:20:38 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:52.795 15:20:38 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:52.795 15:20:38 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:52.795 15:20:38 -- scripts/common.sh@367 -- # return 0 00:04:52.795 15:20:38 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:52.795 15:20:38 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:52.795 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:52.795 --rc genhtml_branch_coverage=1 00:04:52.795 --rc genhtml_function_coverage=1 00:04:52.795 --rc genhtml_legend=1 00:04:52.795 --rc geninfo_all_blocks=1 00:04:52.795 --rc geninfo_unexecuted_blocks=1 00:04:52.795 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:52.795 ' 00:04:52.795 15:20:38 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:52.795 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:52.796 --rc genhtml_branch_coverage=1 00:04:52.796 --rc genhtml_function_coverage=1 00:04:52.796 --rc genhtml_legend=1 00:04:52.796 --rc geninfo_all_blocks=1 00:04:52.796 --rc geninfo_unexecuted_blocks=1 00:04:52.796 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:52.796 ' 00:04:52.796 15:20:38 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:52.796 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:52.796 --rc genhtml_branch_coverage=1 00:04:52.796 --rc genhtml_function_coverage=1 00:04:52.796 --rc genhtml_legend=1 00:04:52.796 --rc geninfo_all_blocks=1 00:04:52.796 --rc geninfo_unexecuted_blocks=1 00:04:52.796 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:52.796 ' 00:04:52.796 15:20:38 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:52.796 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:52.796 --rc genhtml_branch_coverage=1 00:04:52.796 --rc genhtml_function_coverage=1 00:04:52.796 --rc genhtml_legend=1 00:04:52.796 --rc geninfo_all_blocks=1 00:04:52.796 --rc geninfo_unexecuted_blocks=1 00:04:52.796 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:52.796 ' 00:04:52.796 15:20:38 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:04:52.796 15:20:38 -- scheduler/scheduler.sh@35 -- # scheduler_pid=1574547 00:04:52.796 15:20:38 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:04:52.796 15:20:38 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:04:52.796 15:20:38 -- scheduler/scheduler.sh@37 -- # waitforlisten 1574547 00:04:52.796 15:20:38 -- common/autotest_common.sh@829 -- # '[' -z 1574547 ']' 00:04:52.796 15:20:38 -- common/autotest_common.sh@833 
-- # local rpc_addr=/var/tmp/spdk.sock 00:04:52.796 15:20:38 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:52.796 15:20:38 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:52.796 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:52.796 15:20:38 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:52.796 15:20:38 -- common/autotest_common.sh@10 -- # set +x 00:04:52.796 [2024-12-02 15:20:38.500715] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:04:52.796 [2024-12-02 15:20:38.500823] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1574547 ] 00:04:52.796 EAL: No free 2048 kB hugepages reported on node 1 00:04:52.796 [2024-12-02 15:20:38.565141] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:04:53.055 [2024-12-02 15:20:38.641752] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:53.055 [2024-12-02 15:20:38.641839] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:04:53.055 [2024-12-02 15:20:38.641921] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:04:53.055 [2024-12-02 15:20:38.641923] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:04:53.625 15:20:39 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:53.625 15:20:39 -- common/autotest_common.sh@862 -- # return 0 00:04:53.625 15:20:39 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:04:53.625 15:20:39 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:53.625 15:20:39 -- common/autotest_common.sh@10 -- # set +x 00:04:53.625 POWER: Env isn't set yet! 00:04:53.625 POWER: Attempting to initialise ACPI cpufreq power management... 00:04:53.625 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:04:53.625 POWER: Cannot set governor of lcore 0 to userspace 00:04:53.625 POWER: Attempting to initialise PSTAT power management... 
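waitforlisten (scheduler_pid 1574547, rpc_addr /var/tmp/spdk.sock, max_retries=100 in the trace above) blocks until the freshly launched app's RPC socket answers. A sketch of that loop; the readiness probe shown here (rpc_get_methods over rpc.py) is an assumption, since the actual probe does not appear in this log:

    # waitforlisten <pid> [rpc_addr]: poll until the app's RPC socket is up (sketch)
    waitforlisten() {
        local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock}
        local rpc=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
        local max_retries=100 i
        echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
        for (( i = 0; i < max_retries; i++ )); do
            kill -0 "$pid" 2>/dev/null || return 1    # app died during startup
            "$rpc" -s "$rpc_addr" -t 1 rpc_get_methods &>/dev/null && return 0
            sleep 0.5
        done
        return 1
    }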
00:04:53.625 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:04:53.625 POWER: Initialized successfully for lcore 0 power management 00:04:53.625 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:04:53.625 POWER: Initialized successfully for lcore 1 power management 00:04:53.625 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:04:53.625 POWER: Initialized successfully for lcore 2 power management 00:04:53.625 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:04:53.625 POWER: Initialized successfully for lcore 3 power management 00:04:53.625 [2024-12-02 15:20:39.394926] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:04:53.625 [2024-12-02 15:20:39.394942] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:04:53.625 [2024-12-02 15:20:39.394952] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:04:53.625 15:20:39 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:53.625 15:20:39 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:04:53.625 15:20:39 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:53.625 15:20:39 -- common/autotest_common.sh@10 -- # set +x 00:04:53.886 [2024-12-02 15:20:39.463373] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:04:53.886 15:20:39 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:53.886 15:20:39 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:04:53.886 15:20:39 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:53.886 15:20:39 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:53.886 15:20:39 -- common/autotest_common.sh@10 -- # set +x 00:04:53.886 ************************************ 00:04:53.886 START TEST scheduler_create_thread 00:04:53.886 ************************************ 00:04:53.886 15:20:39 -- common/autotest_common.sh@1114 -- # scheduler_create_thread 00:04:53.886 15:20:39 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:04:53.886 15:20:39 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:53.886 15:20:39 -- common/autotest_common.sh@10 -- # set +x 00:04:53.886 2 00:04:53.886 15:20:39 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:53.886 15:20:39 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:04:53.886 15:20:39 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:53.886 15:20:39 -- common/autotest_common.sh@10 -- # set +x 00:04:53.886 3 00:04:53.886 15:20:39 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:53.886 15:20:39 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:04:53.886 15:20:39 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:53.886 15:20:39 -- common/autotest_common.sh@10 -- # set +x 00:04:53.886 4 00:04:53.886 15:20:39 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:53.886 15:20:39 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:04:53.886 15:20:39 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:53.886 15:20:39 -- common/autotest_common.sh@10 -- # set +x 00:04:53.886 5 00:04:53.886 
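scheduler_create_thread builds its workload purely over RPC: four busy threads pinned one per core (-m 0x1..0x8 with -a 100, where -a looks to be the thread's busy percentage), four idle pinned threads (-a 0), plus unpinned one_third_active/half_active/deleted threads whose ids feed scheduler_thread_set_active and scheduler_thread_delete later in the trace. Condensed into a sketch (rpc_cmd resolves to scripts/rpc.py against the app's socket; that the create call prints the new thread id is inferred from thread_id=11 and 12 in the trace):

    # recreate the thread mix exercised by this unit (sketch of the traced RPCs)
    for mask in 0x1 0x2 0x4 0x8; do
        rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m $mask -a 100
        rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m $mask -a 0
    done
    tid=$(rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0)
    rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active "$tid" 50   # now 50% busy
    tid=$(rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100)
    rpc_cmd --plugin scheduler_plugin scheduler_thread_delete "$tid"          # thread goes away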
15:20:39 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:53.886 15:20:39 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:04:53.886 15:20:39 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:53.886 15:20:39 -- common/autotest_common.sh@10 -- # set +x 00:04:53.886 6 00:04:53.886 15:20:39 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:53.886 15:20:39 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:04:53.886 15:20:39 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:53.886 15:20:39 -- common/autotest_common.sh@10 -- # set +x 00:04:53.886 7 00:04:53.886 15:20:39 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:53.886 15:20:39 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:04:53.886 15:20:39 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:53.886 15:20:39 -- common/autotest_common.sh@10 -- # set +x 00:04:53.886 8 00:04:53.886 15:20:39 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:53.886 15:20:39 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:04:53.886 15:20:39 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:53.886 15:20:39 -- common/autotest_common.sh@10 -- # set +x 00:04:53.886 9 00:04:53.886 15:20:39 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:53.886 15:20:39 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:04:53.886 15:20:39 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:53.886 15:20:39 -- common/autotest_common.sh@10 -- # set +x 00:04:53.886 10 00:04:53.886 15:20:39 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:53.886 15:20:39 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:04:53.886 15:20:39 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:53.886 15:20:39 -- common/autotest_common.sh@10 -- # set +x 00:04:54.456 15:20:40 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:54.456 15:20:40 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:04:54.456 15:20:40 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:04:54.456 15:20:40 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:54.456 15:20:40 -- common/autotest_common.sh@10 -- # set +x 00:04:55.394 15:20:40 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:55.394 15:20:40 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:04:55.394 15:20:40 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:55.394 15:20:40 -- common/autotest_common.sh@10 -- # set +x 00:04:56.332 15:20:41 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:56.332 15:20:41 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:04:56.332 15:20:41 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:04:56.332 15:20:41 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:56.332 15:20:41 -- common/autotest_common.sh@10 -- # set +x 00:04:56.899 15:20:42 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:56.899 00:04:56.899 real 0m3.232s 00:04:56.899 user 0m0.024s 00:04:56.899 sys 0m0.007s 00:04:56.899 15:20:42 -- 
common/autotest_common.sh@1115 -- # xtrace_disable 00:04:56.899 15:20:42 -- common/autotest_common.sh@10 -- # set +x 00:04:56.899 ************************************ 00:04:56.899 END TEST scheduler_create_thread 00:04:56.899 ************************************ 00:04:57.158 15:20:42 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:04:57.158 15:20:42 -- scheduler/scheduler.sh@46 -- # killprocess 1574547 00:04:57.158 15:20:42 -- common/autotest_common.sh@936 -- # '[' -z 1574547 ']' 00:04:57.158 15:20:42 -- common/autotest_common.sh@940 -- # kill -0 1574547 00:04:57.158 15:20:42 -- common/autotest_common.sh@941 -- # uname 00:04:57.158 15:20:42 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:04:57.158 15:20:42 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1574547 00:04:57.158 15:20:42 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:04:57.158 15:20:42 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:04:57.158 15:20:42 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1574547' 00:04:57.158 killing process with pid 1574547 00:04:57.158 15:20:42 -- common/autotest_common.sh@955 -- # kill 1574547 00:04:57.158 15:20:42 -- common/autotest_common.sh@960 -- # wait 1574547 00:04:57.421 [2024-12-02 15:20:43.084510] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:04:57.421 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:04:57.421 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:04:57.421 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:04:57.421 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:04:57.421 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:04:57.421 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:04:57.421 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:04:57.421 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:04:57.697 00:04:57.697 real 0m5.029s 00:04:57.697 user 0m10.292s 00:04:57.697 sys 0m0.418s 00:04:57.697 15:20:43 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:57.697 15:20:43 -- common/autotest_common.sh@10 -- # set +x 00:04:57.697 ************************************ 00:04:57.697 END TEST event_scheduler 00:04:57.697 ************************************ 00:04:57.697 15:20:43 -- event/event.sh@51 -- # modprobe -n nbd 00:04:57.697 15:20:43 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:04:57.697 15:20:43 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:57.697 15:20:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:57.697 15:20:43 -- common/autotest_common.sh@10 -- # set +x 00:04:57.697 ************************************ 00:04:57.697 START TEST app_repeat 00:04:57.697 ************************************ 00:04:57.697 15:20:43 -- common/autotest_common.sh@1114 -- # app_repeat_test 00:04:57.697 15:20:43 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:57.697 15:20:43 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:57.697 15:20:43 -- event/event.sh@13 -- # local nbd_list 00:04:57.697 15:20:43 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:04:57.697 15:20:43 -- 
event/event.sh@14 -- # local bdev_list 00:04:57.697 15:20:43 -- event/event.sh@15 -- # local repeat_times=4 00:04:57.697 15:20:43 -- event/event.sh@17 -- # modprobe nbd 00:04:57.697 15:20:43 -- event/event.sh@19 -- # repeat_pid=1575404 00:04:57.697 15:20:43 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:04:57.697 15:20:43 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:04:57.697 15:20:43 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1575404' 00:04:57.697 Process app_repeat pid: 1575404 00:04:57.697 15:20:43 -- event/event.sh@23 -- # for i in {0..2} 00:04:57.697 15:20:43 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:04:57.697 spdk_app_start Round 0 00:04:57.697 15:20:43 -- event/event.sh@25 -- # waitforlisten 1575404 /var/tmp/spdk-nbd.sock 00:04:57.697 15:20:43 -- common/autotest_common.sh@829 -- # '[' -z 1575404 ']' 00:04:57.697 15:20:43 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:04:57.697 15:20:43 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:57.697 15:20:43 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:04:57.697 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:04:57.697 15:20:43 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:57.697 15:20:43 -- common/autotest_common.sh@10 -- # set +x 00:04:57.697 [2024-12-02 15:20:43.415436] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:04:57.697 [2024-12-02 15:20:43.415525] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1575404 ] 00:04:57.697 EAL: No free 2048 kB hugepages reported on node 1 00:04:57.697 [2024-12-02 15:20:43.485282] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:04:58.023 [2024-12-02 15:20:43.562418] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:04:58.023 [2024-12-02 15:20:43.562420] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:58.635 15:20:44 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:58.635 15:20:44 -- common/autotest_common.sh@862 -- # return 0 00:04:58.635 15:20:44 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:04:58.635 Malloc0 00:04:58.635 15:20:44 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:04:58.895 Malloc1 00:04:58.895 15:20:44 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:04:58.895 15:20:44 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:58.895 15:20:44 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:04:58.895 15:20:44 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:04:58.895 15:20:44 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:58.895 15:20:44 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:04:58.895 15:20:44 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:04:58.895 
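Each app_repeat round provisions its block devices the same way: two 64 MB malloc bdevs with 4096-byte blocks, each exported as a kernel nbd device over the app's dedicated socket. The RPCs appear verbatim in the trace; collected here for reference:

    rpc="/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
    $rpc bdev_malloc_create 64 4096          # 64 MB, 4096-byte blocks -> Malloc0
    $rpc bdev_malloc_create 64 4096          # -> Malloc1
    $rpc nbd_start_disk Malloc0 /dev/nbd0    # expose each bdev as /dev/nbdX
    $rpc nbd_start_disk Malloc1 /dev/nbd1
    # teardown is symmetric: nbd_stop_disk /dev/nbd0 and /dev/nbd1 before the next round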
15:20:44 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:58.895 15:20:44 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:04:58.895 15:20:44 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:04:58.895 15:20:44 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:58.895 15:20:44 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:04:58.895 15:20:44 -- bdev/nbd_common.sh@12 -- # local i 00:04:58.895 15:20:44 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:04:58.895 15:20:44 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:04:58.895 15:20:44 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:04:59.154 /dev/nbd0 00:04:59.154 15:20:44 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:04:59.154 15:20:44 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:04:59.154 15:20:44 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:04:59.154 15:20:44 -- common/autotest_common.sh@867 -- # local i 00:04:59.154 15:20:44 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:04:59.154 15:20:44 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:04:59.154 15:20:44 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:04:59.154 15:20:44 -- common/autotest_common.sh@871 -- # break 00:04:59.154 15:20:44 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:04:59.154 15:20:44 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:04:59.154 15:20:44 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:04:59.154 1+0 records in 00:04:59.154 1+0 records out 00:04:59.155 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000255493 s, 16.0 MB/s 00:04:59.155 15:20:44 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:04:59.155 15:20:44 -- common/autotest_common.sh@884 -- # size=4096 00:04:59.155 15:20:44 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:04:59.155 15:20:44 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:04:59.155 15:20:44 -- common/autotest_common.sh@887 -- # return 0 00:04:59.155 15:20:44 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:04:59.155 15:20:44 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:04:59.155 15:20:44 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:04:59.413 /dev/nbd1 00:04:59.413 15:20:45 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:04:59.413 15:20:45 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:04:59.413 15:20:45 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:04:59.414 15:20:45 -- common/autotest_common.sh@867 -- # local i 00:04:59.414 15:20:45 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:04:59.414 15:20:45 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:04:59.414 15:20:45 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:04:59.414 15:20:45 -- common/autotest_common.sh@871 -- # break 00:04:59.414 15:20:45 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:04:59.414 15:20:45 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:04:59.414 15:20:45 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 
iflag=direct 00:04:59.414 1+0 records in 00:04:59.414 1+0 records out 00:04:59.414 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000199447 s, 20.5 MB/s 00:04:59.414 15:20:45 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:04:59.414 15:20:45 -- common/autotest_common.sh@884 -- # size=4096 00:04:59.414 15:20:45 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:04:59.414 15:20:45 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:04:59.414 15:20:45 -- common/autotest_common.sh@887 -- # return 0 00:04:59.414 15:20:45 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:04:59.414 15:20:45 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:04:59.414 15:20:45 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:04:59.414 15:20:45 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:59.414 15:20:45 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:04:59.672 { 00:04:59.672 "nbd_device": "/dev/nbd0", 00:04:59.672 "bdev_name": "Malloc0" 00:04:59.672 }, 00:04:59.672 { 00:04:59.672 "nbd_device": "/dev/nbd1", 00:04:59.672 "bdev_name": "Malloc1" 00:04:59.672 } 00:04:59.672 ]' 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@64 -- # echo '[ 00:04:59.672 { 00:04:59.672 "nbd_device": "/dev/nbd0", 00:04:59.672 "bdev_name": "Malloc0" 00:04:59.672 }, 00:04:59.672 { 00:04:59.672 "nbd_device": "/dev/nbd1", 00:04:59.672 "bdev_name": "Malloc1" 00:04:59.672 } 00:04:59.672 ]' 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:04:59.672 /dev/nbd1' 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:04:59.672 /dev/nbd1' 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@65 -- # count=2 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@66 -- # echo 2 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@95 -- # count=2 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@71 -- # local operation=write 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:04:59.672 256+0 records in 00:04:59.672 256+0 records out 00:04:59.672 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0105818 s, 99.1 MB/s 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:04:59.672 256+0 records in 00:04:59.672 256+0 records out 00:04:59.672 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0197748 s, 53.0 MB/s 00:04:59.672 15:20:45 -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:04:59.672 256+0 records in 00:04:59.672 256+0 records out 00:04:59.672 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0212977 s, 49.2 MB/s 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@51 -- # local i 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:04:59.672 15:20:45 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:04:59.930 15:20:45 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:04:59.930 15:20:45 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:04:59.930 15:20:45 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:04:59.930 15:20:45 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:04:59.930 15:20:45 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:04:59.930 15:20:45 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:04:59.930 15:20:45 -- bdev/nbd_common.sh@41 -- # break 00:04:59.930 15:20:45 -- bdev/nbd_common.sh@45 -- # return 0 00:04:59.930 15:20:45 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:04:59.930 15:20:45 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:00.189 15:20:45 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:00.189 15:20:45 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:00.189 15:20:45 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:00.189 15:20:45 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:00.189 15:20:45 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:00.189 15:20:45 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:00.189 15:20:45 -- bdev/nbd_common.sh@41 -- # break 00:05:00.189 15:20:45 -- 
bdev/nbd_common.sh@45 -- # return 0 00:05:00.189 15:20:45 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:00.189 15:20:45 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:00.189 15:20:45 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:00.190 15:20:45 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:00.190 15:20:45 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:00.190 15:20:45 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:00.190 15:20:45 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:00.190 15:20:45 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:00.190 15:20:45 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:00.190 15:20:45 -- bdev/nbd_common.sh@65 -- # true 00:05:00.190 15:20:45 -- bdev/nbd_common.sh@65 -- # count=0 00:05:00.190 15:20:45 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:00.190 15:20:45 -- bdev/nbd_common.sh@104 -- # count=0 00:05:00.190 15:20:45 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:00.190 15:20:45 -- bdev/nbd_common.sh@109 -- # return 0 00:05:00.190 15:20:45 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:00.449 15:20:46 -- event/event.sh@35 -- # sleep 3 00:05:00.708 [2024-12-02 15:20:46.374984] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:00.708 [2024-12-02 15:20:46.437502] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:00.708 [2024-12-02 15:20:46.437504] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:00.708 [2024-12-02 15:20:46.478061] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:00.708 [2024-12-02 15:20:46.478101] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:03.997 15:20:49 -- event/event.sh@23 -- # for i in {0..2} 00:05:03.998 15:20:49 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:05:03.998 spdk_app_start Round 1 00:05:03.998 15:20:49 -- event/event.sh@25 -- # waitforlisten 1575404 /var/tmp/spdk-nbd.sock 00:05:03.998 15:20:49 -- common/autotest_common.sh@829 -- # '[' -z 1575404 ']' 00:05:03.998 15:20:49 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:03.998 15:20:49 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:03.998 15:20:49 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:03.998 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
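The Round 0 data path above is nbd_common.sh's dd/cmp verification: seed a 1 MiB random file, push it through each nbd device with O_DIRECT, and byte-compare the readback; app_repeat then tears the target down with spdk_kill_instance SIGTERM, sleeps 3 seconds, and re-enters with 'spdk_app_start Round 1' as seen here. The verify step, condensed from the trace:

    tmp=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
    dd if=/dev/urandom of=$tmp bs=4096 count=256            # 1 MiB of random data
    for nbd in /dev/nbd0 /dev/nbd1; do
        dd if=$tmp of=$nbd bs=4096 count=256 oflag=direct   # write through the kernel nbd
        cmp -b -n 1M $tmp $nbd                              # fail loudly on any mismatch
    done
    rm $tmp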
00:05:03.998 15:20:49 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:03.998 15:20:49 -- common/autotest_common.sh@10 -- # set +x 00:05:03.998 15:20:49 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:03.998 15:20:49 -- common/autotest_common.sh@862 -- # return 0 00:05:03.998 15:20:49 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:03.998 Malloc0 00:05:03.998 15:20:49 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:03.998 Malloc1 00:05:03.998 15:20:49 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:03.998 15:20:49 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:03.998 15:20:49 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:03.998 15:20:49 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:03.998 15:20:49 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:03.998 15:20:49 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:03.998 15:20:49 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:03.998 15:20:49 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:03.998 15:20:49 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:03.998 15:20:49 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:03.998 15:20:49 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:03.998 15:20:49 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:03.998 15:20:49 -- bdev/nbd_common.sh@12 -- # local i 00:05:03.998 15:20:49 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:03.998 15:20:49 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:03.998 15:20:49 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:04.257 /dev/nbd0 00:05:04.257 15:20:49 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:04.258 15:20:49 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:04.258 15:20:49 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:05:04.258 15:20:49 -- common/autotest_common.sh@867 -- # local i 00:05:04.258 15:20:49 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:04.258 15:20:49 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:04.258 15:20:49 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:05:04.258 15:20:49 -- common/autotest_common.sh@871 -- # break 00:05:04.258 15:20:49 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:04.258 15:20:49 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:04.258 15:20:49 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:04.258 1+0 records in 00:05:04.258 1+0 records out 00:05:04.258 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000238346 s, 17.2 MB/s 00:05:04.258 15:20:49 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:04.258 15:20:49 -- common/autotest_common.sh@884 -- # size=4096 00:05:04.258 15:20:49 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:04.258 15:20:49 -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:04.258 15:20:49 -- common/autotest_common.sh@887 -- # return 0 00:05:04.258 15:20:49 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:04.258 15:20:49 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:04.258 15:20:49 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:04.517 /dev/nbd1 00:05:04.517 15:20:50 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:04.517 15:20:50 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:04.517 15:20:50 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:05:04.517 15:20:50 -- common/autotest_common.sh@867 -- # local i 00:05:04.517 15:20:50 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:04.517 15:20:50 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:04.517 15:20:50 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:05:04.517 15:20:50 -- common/autotest_common.sh@871 -- # break 00:05:04.517 15:20:50 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:04.517 15:20:50 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:04.517 15:20:50 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:04.517 1+0 records in 00:05:04.517 1+0 records out 00:05:04.517 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253692 s, 16.1 MB/s 00:05:04.517 15:20:50 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:04.517 15:20:50 -- common/autotest_common.sh@884 -- # size=4096 00:05:04.517 15:20:50 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:04.517 15:20:50 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:04.517 15:20:50 -- common/autotest_common.sh@887 -- # return 0 00:05:04.517 15:20:50 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:04.517 15:20:50 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:04.517 15:20:50 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:04.517 15:20:50 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:04.517 15:20:50 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:04.776 { 00:05:04.776 "nbd_device": "/dev/nbd0", 00:05:04.776 "bdev_name": "Malloc0" 00:05:04.776 }, 00:05:04.776 { 00:05:04.776 "nbd_device": "/dev/nbd1", 00:05:04.776 "bdev_name": "Malloc1" 00:05:04.776 } 00:05:04.776 ]' 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:04.776 { 00:05:04.776 "nbd_device": "/dev/nbd0", 00:05:04.776 "bdev_name": "Malloc0" 00:05:04.776 }, 00:05:04.776 { 00:05:04.776 "nbd_device": "/dev/nbd1", 00:05:04.776 "bdev_name": "Malloc1" 00:05:04.776 } 00:05:04.776 ]' 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:04.776 /dev/nbd1' 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:04.776 /dev/nbd1' 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@65 -- # count=2 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:04.776 15:20:50 -- 
bdev/nbd_common.sh@95 -- # count=2 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:04.776 256+0 records in 00:05:04.776 256+0 records out 00:05:04.776 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115492 s, 90.8 MB/s 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:04.776 256+0 records in 00:05:04.776 256+0 records out 00:05:04.776 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200838 s, 52.2 MB/s 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:04.776 256+0 records in 00:05:04.776 256+0 records out 00:05:04.776 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0212996 s, 49.2 MB/s 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@51 -- # local i 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:04.776 15:20:50 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:05.035 15:20:50 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:05.035 15:20:50 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:05.035 15:20:50 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:05.035 15:20:50 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:05.035 15:20:50 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:05.035 15:20:50 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:05.035 15:20:50 -- bdev/nbd_common.sh@41 -- # break 00:05:05.035 15:20:50 -- bdev/nbd_common.sh@45 -- # return 0 00:05:05.035 15:20:50 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:05.035 15:20:50 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:05.294 15:20:50 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:05.294 15:20:50 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:05.294 15:20:50 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:05.294 15:20:50 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:05.294 15:20:50 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:05.294 15:20:50 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:05.294 15:20:50 -- bdev/nbd_common.sh@41 -- # break 00:05:05.294 15:20:50 -- bdev/nbd_common.sh@45 -- # return 0 00:05:05.294 15:20:50 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:05.294 15:20:50 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:05.294 15:20:50 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:05.294 15:20:51 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:05.294 15:20:51 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:05.294 15:20:51 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:05.294 15:20:51 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:05.294 15:20:51 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:05.294 15:20:51 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:05.294 15:20:51 -- bdev/nbd_common.sh@65 -- # true 00:05:05.294 15:20:51 -- bdev/nbd_common.sh@65 -- # count=0 00:05:05.294 15:20:51 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:05.554 15:20:51 -- bdev/nbd_common.sh@104 -- # count=0 00:05:05.554 15:20:51 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:05.554 15:20:51 -- bdev/nbd_common.sh@109 -- # return 0 00:05:05.554 15:20:51 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:05.554 15:20:51 -- event/event.sh@35 -- # sleep 3 00:05:05.813 [2024-12-02 15:20:51.483257] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:05.813 [2024-12-02 15:20:51.546040] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:05.813 [2024-12-02 15:20:51.546042] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:05.813 [2024-12-02 15:20:51.586647] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:05.813 [2024-12-02 15:20:51.586690] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
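The pass above is the heart of nbd_dd_data_verify: fill a temp file with random data, copy it onto each exported nbd device, then byte-compare it back. A minimal sketch of the same write-then-verify pattern, assuming the devices are already connected (paths and sizes follow the trace; this is illustrative, not the harness itself):

# Write-then-verify pattern from the trace above. Assumes /dev/nbd0 and
# /dev/nbd1 are already exported by spdk-nbd (illustrative paths).
nbd_list=(/dev/nbd0 /dev/nbd1)
tmp_file=$(mktemp)

# write: 256 blocks of 4 KiB = 1 MiB of random data per device
dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
for dev in "${nbd_list[@]}"; do
    dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
done

# verify: byte-compare the first 1 MiB of each device against the file
for dev in "${nbd_list[@]}"; do
    cmp -b -n 1M "$tmp_file" "$dev"
done
rm -f "$tmp_file"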
00:05:09.101 15:20:54 -- event/event.sh@23 -- # for i in {0..2} 00:05:09.101 15:20:54 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:05:09.101 spdk_app_start Round 2 00:05:09.101 15:20:54 -- event/event.sh@25 -- # waitforlisten 1575404 /var/tmp/spdk-nbd.sock 00:05:09.101 15:20:54 -- common/autotest_common.sh@829 -- # '[' -z 1575404 ']' 00:05:09.101 15:20:54 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:09.101 15:20:54 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:09.101 15:20:54 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:09.101 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:09.101 15:20:54 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:09.101 15:20:54 -- common/autotest_common.sh@10 -- # set +x 00:05:09.101 15:20:54 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:09.101 15:20:54 -- common/autotest_common.sh@862 -- # return 0 00:05:09.101 15:20:54 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:09.101 Malloc0 00:05:09.101 15:20:54 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:09.101 Malloc1 00:05:09.101 15:20:54 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:09.101 15:20:54 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:09.101 15:20:54 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:09.101 15:20:54 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:09.101 15:20:54 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:09.101 15:20:54 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:09.101 15:20:54 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:09.101 15:20:54 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:09.101 15:20:54 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:09.101 15:20:54 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:09.101 15:20:54 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:09.101 15:20:54 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:09.101 15:20:54 -- bdev/nbd_common.sh@12 -- # local i 00:05:09.102 15:20:54 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:09.102 15:20:54 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:09.102 15:20:54 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:09.360 /dev/nbd0 00:05:09.360 15:20:55 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:09.360 15:20:55 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:09.361 15:20:55 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:05:09.361 15:20:55 -- common/autotest_common.sh@867 -- # local i 00:05:09.361 15:20:55 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:09.361 15:20:55 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:09.361 15:20:55 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:05:09.361 15:20:55 -- common/autotest_common.sh@871 -- # break 00:05:09.361 15:20:55 -- common/autotest_common.sh@882 -- # (( 
i = 1 )) 00:05:09.361 15:20:55 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:09.361 15:20:55 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:09.361 1+0 records in 00:05:09.361 1+0 records out 00:05:09.361 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253376 s, 16.2 MB/s 00:05:09.361 15:20:55 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:09.361 15:20:55 -- common/autotest_common.sh@884 -- # size=4096 00:05:09.361 15:20:55 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:09.361 15:20:55 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:09.361 15:20:55 -- common/autotest_common.sh@887 -- # return 0 00:05:09.361 15:20:55 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:09.361 15:20:55 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:09.361 15:20:55 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:09.619 /dev/nbd1 00:05:09.619 15:20:55 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:09.619 15:20:55 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:09.619 15:20:55 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:05:09.619 15:20:55 -- common/autotest_common.sh@867 -- # local i 00:05:09.619 15:20:55 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:09.619 15:20:55 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:09.619 15:20:55 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:05:09.619 15:20:55 -- common/autotest_common.sh@871 -- # break 00:05:09.619 15:20:55 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:09.619 15:20:55 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:09.619 15:20:55 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:09.619 1+0 records in 00:05:09.619 1+0 records out 00:05:09.619 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000231354 s, 17.7 MB/s 00:05:09.619 15:20:55 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:09.619 15:20:55 -- common/autotest_common.sh@884 -- # size=4096 00:05:09.619 15:20:55 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:09.619 15:20:55 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:09.619 15:20:55 -- common/autotest_common.sh@887 -- # return 0 00:05:09.619 15:20:55 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:09.619 15:20:55 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:09.619 15:20:55 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:09.619 15:20:55 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:09.619 15:20:55 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:09.877 { 00:05:09.877 "nbd_device": "/dev/nbd0", 00:05:09.877 "bdev_name": "Malloc0" 00:05:09.877 }, 00:05:09.877 { 00:05:09.877 "nbd_device": "/dev/nbd1", 00:05:09.877 "bdev_name": "Malloc1" 00:05:09.877 } 00:05:09.877 ]' 00:05:09.877 15:20:55 
-- bdev/nbd_common.sh@64 -- # echo '[ 00:05:09.877 { 00:05:09.877 "nbd_device": "/dev/nbd0", 00:05:09.877 "bdev_name": "Malloc0" 00:05:09.877 }, 00:05:09.877 { 00:05:09.877 "nbd_device": "/dev/nbd1", 00:05:09.877 "bdev_name": "Malloc1" 00:05:09.877 } 00:05:09.877 ]' 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:09.877 /dev/nbd1' 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:09.877 /dev/nbd1' 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@65 -- # count=2 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@95 -- # count=2 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:09.877 256+0 records in 00:05:09.877 256+0 records out 00:05:09.877 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0107346 s, 97.7 MB/s 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:09.877 256+0 records in 00:05:09.877 256+0 records out 00:05:09.877 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0201464 s, 52.0 MB/s 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:09.877 256+0 records in 00:05:09.877 256+0 records out 00:05:09.877 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0211328 s, 49.6 MB/s 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@51 -- # local i 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:09.877 15:20:55 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:10.135 15:20:55 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:10.135 15:20:55 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:10.135 15:20:55 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:10.135 15:20:55 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:10.135 15:20:55 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:10.135 15:20:55 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:10.135 15:20:55 -- bdev/nbd_common.sh@41 -- # break 00:05:10.135 15:20:55 -- bdev/nbd_common.sh@45 -- # return 0 00:05:10.135 15:20:55 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:10.135 15:20:55 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:10.393 15:20:56 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:10.393 15:20:56 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:10.393 15:20:56 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:10.393 15:20:56 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:10.393 15:20:56 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:10.393 15:20:56 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:10.393 15:20:56 -- bdev/nbd_common.sh@41 -- # break 00:05:10.393 15:20:56 -- bdev/nbd_common.sh@45 -- # return 0 00:05:10.393 15:20:56 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:10.393 15:20:56 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:10.393 15:20:56 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:10.393 15:20:56 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:10.393 15:20:56 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:10.393 15:20:56 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:10.652 15:20:56 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:10.652 15:20:56 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:10.652 15:20:56 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:10.652 15:20:56 -- bdev/nbd_common.sh@65 -- # true 00:05:10.652 15:20:56 -- bdev/nbd_common.sh@65 -- # count=0 00:05:10.652 15:20:56 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:10.652 15:20:56 -- bdev/nbd_common.sh@104 -- # count=0 00:05:10.652 15:20:56 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:10.652 15:20:56 -- bdev/nbd_common.sh@109 -- # return 0 00:05:10.652 15:20:56 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:10.652 15:20:56 -- 
event/event.sh@35 -- # sleep 3 00:05:10.911 [2024-12-02 15:20:56.621975] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:10.911 [2024-12-02 15:20:56.684723] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:10.911 [2024-12-02 15:20:56.684725] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:10.911 [2024-12-02 15:20:56.725221] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:10.911 [2024-12-02 15:20:56.725281] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:14.199 15:20:59 -- event/event.sh@38 -- # waitforlisten 1575404 /var/tmp/spdk-nbd.sock 00:05:14.199 15:20:59 -- common/autotest_common.sh@829 -- # '[' -z 1575404 ']' 00:05:14.199 15:20:59 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:14.199 15:20:59 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:14.199 15:20:59 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:14.199 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:14.199 15:20:59 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:14.199 15:20:59 -- common/autotest_common.sh@10 -- # set +x 00:05:14.199 15:20:59 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:14.199 15:20:59 -- common/autotest_common.sh@862 -- # return 0 00:05:14.199 15:20:59 -- event/event.sh@39 -- # killprocess 1575404 00:05:14.199 15:20:59 -- common/autotest_common.sh@936 -- # '[' -z 1575404 ']' 00:05:14.199 15:20:59 -- common/autotest_common.sh@940 -- # kill -0 1575404 00:05:14.199 15:20:59 -- common/autotest_common.sh@941 -- # uname 00:05:14.199 15:20:59 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:14.199 15:20:59 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1575404 00:05:14.199 15:20:59 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:14.199 15:20:59 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:14.199 15:20:59 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1575404' 00:05:14.199 killing process with pid 1575404 00:05:14.199 15:20:59 -- common/autotest_common.sh@955 -- # kill 1575404 00:05:14.199 15:20:59 -- common/autotest_common.sh@960 -- # wait 1575404 00:05:14.199 spdk_app_start is called in Round 0. 00:05:14.199 Shutdown signal received, stop current app iteration 00:05:14.199 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:05:14.199 spdk_app_start is called in Round 1. 00:05:14.199 Shutdown signal received, stop current app iteration 00:05:14.199 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:05:14.199 spdk_app_start is called in Round 2. 00:05:14.199 Shutdown signal received, stop current app iteration 00:05:14.199 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:05:14.199 spdk_app_start is called in Round 3. 
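Each app_repeat round above follows the same shape: wait for the target's RPC socket, create two malloc bdevs, export them over nbd, run the write/verify pass, then ask the app to terminate itself and pause before the next round. A condensed sketch of the loop, with the restart handling simplified (the real harness relaunches the app between rounds):

# Condensed app_repeat round loop, as driven by event.sh in the trace.
# RPC socket and rpc.py invocation follow the log; restart is simplified.
RPC_SOCK=/var/tmp/spdk-nbd.sock
RPC="scripts/rpc.py -s $RPC_SOCK"

for round in {0..2}; do
    echo "spdk_app_start Round $round"
    $RPC bdev_malloc_create 64 4096          # -> Malloc0
    $RPC bdev_malloc_create 64 4096          # -> Malloc1
    $RPC nbd_start_disk Malloc0 /dev/nbd0
    $RPC nbd_start_disk Malloc1 /dev/nbd1
    # ... write/verify as sketched earlier, then nbd_stop_disk both ...
    $RPC spdk_kill_instance SIGTERM          # app exits; harness restarts it
    sleep 3
done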
00:05:14.199 Shutdown signal received, stop current app iteration 00:05:14.199 15:20:59 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:05:14.199 15:20:59 -- event/event.sh@42 -- # return 0 00:05:14.199 00:05:14.199 real 0m16.464s 00:05:14.199 user 0m35.130s 00:05:14.199 sys 0m3.057s 00:05:14.199 15:20:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:14.199 15:20:59 -- common/autotest_common.sh@10 -- # set +x 00:05:14.199 ************************************ 00:05:14.199 END TEST app_repeat 00:05:14.199 ************************************ 00:05:14.199 15:20:59 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:05:14.199 15:20:59 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:14.199 15:20:59 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:14.199 15:20:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:14.199 15:20:59 -- common/autotest_common.sh@10 -- # set +x 00:05:14.199 ************************************ 00:05:14.199 START TEST cpu_locks 00:05:14.199 ************************************ 00:05:14.199 15:20:59 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:14.199 * Looking for test storage... 00:05:14.199 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:14.199 15:21:00 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:14.199 15:21:00 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:14.199 15:21:00 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:14.458 15:21:00 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:14.458 15:21:00 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:14.458 15:21:00 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:14.458 15:21:00 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:14.458 15:21:00 -- scripts/common.sh@335 -- # IFS=.-: 00:05:14.458 15:21:00 -- scripts/common.sh@335 -- # read -ra ver1 00:05:14.458 15:21:00 -- scripts/common.sh@336 -- # IFS=.-: 00:05:14.458 15:21:00 -- scripts/common.sh@336 -- # read -ra ver2 00:05:14.458 15:21:00 -- scripts/common.sh@337 -- # local 'op=<' 00:05:14.458 15:21:00 -- scripts/common.sh@339 -- # ver1_l=2 00:05:14.458 15:21:00 -- scripts/common.sh@340 -- # ver2_l=1 00:05:14.458 15:21:00 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:14.458 15:21:00 -- scripts/common.sh@343 -- # case "$op" in 00:05:14.458 15:21:00 -- scripts/common.sh@344 -- # : 1 00:05:14.458 15:21:00 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:14.458 15:21:00 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:14.458 15:21:00 -- scripts/common.sh@364 -- # decimal 1 00:05:14.458 15:21:00 -- scripts/common.sh@352 -- # local d=1 00:05:14.458 15:21:00 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:14.458 15:21:00 -- scripts/common.sh@354 -- # echo 1 00:05:14.458 15:21:00 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:14.459 15:21:00 -- scripts/common.sh@365 -- # decimal 2 00:05:14.459 15:21:00 -- scripts/common.sh@352 -- # local d=2 00:05:14.459 15:21:00 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:14.459 15:21:00 -- scripts/common.sh@354 -- # echo 2 00:05:14.459 15:21:00 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:14.459 15:21:00 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:14.459 15:21:00 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:14.459 15:21:00 -- scripts/common.sh@367 -- # return 0 00:05:14.459 15:21:00 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:14.459 15:21:00 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:14.459 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:14.459 --rc genhtml_branch_coverage=1 00:05:14.459 --rc genhtml_function_coverage=1 00:05:14.459 --rc genhtml_legend=1 00:05:14.459 --rc geninfo_all_blocks=1 00:05:14.459 --rc geninfo_unexecuted_blocks=1 00:05:14.459 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:14.459 ' 00:05:14.459 15:21:00 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:14.459 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:14.459 --rc genhtml_branch_coverage=1 00:05:14.459 --rc genhtml_function_coverage=1 00:05:14.459 --rc genhtml_legend=1 00:05:14.459 --rc geninfo_all_blocks=1 00:05:14.459 --rc geninfo_unexecuted_blocks=1 00:05:14.459 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:14.459 ' 00:05:14.459 15:21:00 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:14.459 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:14.459 --rc genhtml_branch_coverage=1 00:05:14.459 --rc genhtml_function_coverage=1 00:05:14.459 --rc genhtml_legend=1 00:05:14.459 --rc geninfo_all_blocks=1 00:05:14.459 --rc geninfo_unexecuted_blocks=1 00:05:14.459 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:14.459 ' 00:05:14.459 15:21:00 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:14.459 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:14.459 --rc genhtml_branch_coverage=1 00:05:14.459 --rc genhtml_function_coverage=1 00:05:14.459 --rc genhtml_legend=1 00:05:14.459 --rc geninfo_all_blocks=1 00:05:14.459 --rc geninfo_unexecuted_blocks=1 00:05:14.459 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:14.459 ' 00:05:14.459 15:21:00 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:05:14.459 15:21:00 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:05:14.459 15:21:00 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:05:14.459 15:21:00 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:05:14.459 15:21:00 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:14.459 15:21:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:14.459 15:21:00 -- common/autotest_common.sh@10 -- # set +x 00:05:14.459 ************************************ 00:05:14.459 START TEST default_locks 
00:05:14.459 ************************************ 00:05:14.459 15:21:00 -- common/autotest_common.sh@1114 -- # default_locks 00:05:14.459 15:21:00 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=1578634 00:05:14.459 15:21:00 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:14.459 15:21:00 -- event/cpu_locks.sh@47 -- # waitforlisten 1578634 00:05:14.459 15:21:00 -- common/autotest_common.sh@829 -- # '[' -z 1578634 ']' 00:05:14.459 15:21:00 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:14.459 15:21:00 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:14.459 15:21:00 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:14.459 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:14.459 15:21:00 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:14.459 15:21:00 -- common/autotest_common.sh@10 -- # set +x 00:05:14.459 [2024-12-02 15:21:00.125171] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:14.459 [2024-12-02 15:21:00.125252] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1578634 ] 00:05:14.459 EAL: No free 2048 kB hugepages reported on node 1 00:05:14.459 [2024-12-02 15:21:00.195051] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:14.459 [2024-12-02 15:21:00.270192] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:14.459 [2024-12-02 15:21:00.270302] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:15.395 15:21:00 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:15.395 15:21:00 -- common/autotest_common.sh@862 -- # return 0 00:05:15.395 15:21:00 -- event/cpu_locks.sh@49 -- # locks_exist 1578634 00:05:15.395 15:21:00 -- event/cpu_locks.sh@22 -- # lslocks -p 1578634 00:05:15.395 15:21:00 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:15.655 lslocks: write error 00:05:15.655 15:21:01 -- event/cpu_locks.sh@50 -- # killprocess 1578634 00:05:15.655 15:21:01 -- common/autotest_common.sh@936 -- # '[' -z 1578634 ']' 00:05:15.655 15:21:01 -- common/autotest_common.sh@940 -- # kill -0 1578634 00:05:15.655 15:21:01 -- common/autotest_common.sh@941 -- # uname 00:05:15.655 15:21:01 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:15.655 15:21:01 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1578634 00:05:15.655 15:21:01 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:15.655 15:21:01 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:15.655 15:21:01 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1578634' 00:05:15.655 killing process with pid 1578634 00:05:15.655 15:21:01 -- common/autotest_common.sh@955 -- # kill 1578634 00:05:15.655 15:21:01 -- common/autotest_common.sh@960 -- # wait 1578634 00:05:15.915 15:21:01 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 1578634 00:05:15.915 15:21:01 -- common/autotest_common.sh@650 -- # local es=0 00:05:15.915 15:21:01 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 1578634 00:05:15.915 15:21:01 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:05:15.915 15:21:01 -- 
common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:15.915 15:21:01 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:05:15.915 15:21:01 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:15.915 15:21:01 -- common/autotest_common.sh@653 -- # waitforlisten 1578634 00:05:15.915 15:21:01 -- common/autotest_common.sh@829 -- # '[' -z 1578634 ']' 00:05:15.915 15:21:01 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:15.915 15:21:01 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:15.915 15:21:01 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:15.915 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:15.915 15:21:01 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:15.915 15:21:01 -- common/autotest_common.sh@10 -- # set +x 00:05:15.915 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (1578634) - No such process 00:05:15.915 ERROR: process (pid: 1578634) is no longer running 00:05:15.915 15:21:01 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:15.915 15:21:01 -- common/autotest_common.sh@862 -- # return 1 00:05:15.915 15:21:01 -- common/autotest_common.sh@653 -- # es=1 00:05:15.915 15:21:01 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:15.915 15:21:01 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:15.915 15:21:01 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:15.915 15:21:01 -- event/cpu_locks.sh@54 -- # no_locks 00:05:15.915 15:21:01 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:15.915 15:21:01 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:15.915 15:21:01 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:15.915 00:05:15.915 real 0m1.606s 00:05:15.915 user 0m1.683s 00:05:15.915 sys 0m0.560s 00:05:15.915 15:21:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:15.915 15:21:01 -- common/autotest_common.sh@10 -- # set +x 00:05:15.915 ************************************ 00:05:15.915 END TEST default_locks 00:05:15.915 ************************************ 00:05:16.175 15:21:01 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:05:16.175 15:21:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:16.175 15:21:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:16.175 15:21:01 -- common/autotest_common.sh@10 -- # set +x 00:05:16.175 ************************************ 00:05:16.175 START TEST default_locks_via_rpc 00:05:16.175 ************************************ 00:05:16.175 15:21:01 -- common/autotest_common.sh@1114 -- # default_locks_via_rpc 00:05:16.175 15:21:01 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=1579042 00:05:16.175 15:21:01 -- event/cpu_locks.sh@63 -- # waitforlisten 1579042 00:05:16.175 15:21:01 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:16.175 15:21:01 -- common/autotest_common.sh@829 -- # '[' -z 1579042 ']' 00:05:16.175 15:21:01 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:16.175 15:21:01 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:16.175 15:21:01 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
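The killprocess helper used throughout these lock tests is deliberately defensive: it checks the PID is alive, refuses to kill anything whose command name is sudo, and waits for the target to exit. The negative default_locks case above then wraps waitforlisten in NOT, which passes only if the wrapped command fails. A rough sketch of both helpers, simplified from the trace:

# Rough sketch of the two helpers exercised above (simplified).
killprocess() {
    local pid=$1
    [ -n "$pid" ] || return 1
    kill -0 "$pid" || return 0             # already gone
    if [ "$(uname)" = Linux ]; then
        local name
        name=$(ps --no-headers -o comm= "$pid")
        [ "$name" = sudo ] && return 1     # never kill a sudo wrapper
    fi
    echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid" || true
}

# NOT: invert a command's status. "NOT waitforlisten <dead pid>" above
# succeeds precisely because waitforlisten fails.
NOT() {
    ! "$@"
}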
00:05:16.175 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:16.175 15:21:01 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:16.175 15:21:01 -- common/autotest_common.sh@10 -- # set +x 00:05:16.175 [2024-12-02 15:21:01.780997] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:16.175 [2024-12-02 15:21:01.781062] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1579042 ] 00:05:16.175 EAL: No free 2048 kB hugepages reported on node 1 00:05:16.175 [2024-12-02 15:21:01.848860] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:16.175 [2024-12-02 15:21:01.922524] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:16.175 [2024-12-02 15:21:01.922647] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:17.113 15:21:02 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:17.113 15:21:02 -- common/autotest_common.sh@862 -- # return 0 00:05:17.113 15:21:02 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:05:17.113 15:21:02 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:17.113 15:21:02 -- common/autotest_common.sh@10 -- # set +x 00:05:17.113 15:21:02 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:17.113 15:21:02 -- event/cpu_locks.sh@67 -- # no_locks 00:05:17.113 15:21:02 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:17.113 15:21:02 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:17.113 15:21:02 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:17.113 15:21:02 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:05:17.113 15:21:02 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:17.113 15:21:02 -- common/autotest_common.sh@10 -- # set +x 00:05:17.113 15:21:02 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:17.113 15:21:02 -- event/cpu_locks.sh@71 -- # locks_exist 1579042 00:05:17.113 15:21:02 -- event/cpu_locks.sh@22 -- # lslocks -p 1579042 00:05:17.113 15:21:02 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:17.372 15:21:03 -- event/cpu_locks.sh@73 -- # killprocess 1579042 00:05:17.372 15:21:03 -- common/autotest_common.sh@936 -- # '[' -z 1579042 ']' 00:05:17.372 15:21:03 -- common/autotest_common.sh@940 -- # kill -0 1579042 00:05:17.372 15:21:03 -- common/autotest_common.sh@941 -- # uname 00:05:17.372 15:21:03 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:17.372 15:21:03 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1579042 00:05:17.372 15:21:03 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:17.372 15:21:03 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:17.630 15:21:03 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1579042' 00:05:17.630 killing process with pid 1579042 00:05:17.630 15:21:03 -- common/autotest_common.sh@955 -- # kill 1579042 00:05:17.630 15:21:03 -- common/autotest_common.sh@960 -- # wait 1579042 00:05:17.888 00:05:17.888 real 0m1.737s 00:05:17.888 user 0m1.843s 00:05:17.888 sys 0m0.603s 00:05:17.888 15:21:03 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:17.888 15:21:03 -- common/autotest_common.sh@10 -- # set +x 00:05:17.888 ************************************ 00:05:17.888 END TEST default_locks_via_rpc 00:05:17.888 
************************************ 00:05:17.888 15:21:03 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:05:17.888 15:21:03 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:17.888 15:21:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:17.888 15:21:03 -- common/autotest_common.sh@10 -- # set +x 00:05:17.888 ************************************ 00:05:17.888 START TEST non_locking_app_on_locked_coremask 00:05:17.888 ************************************ 00:05:17.888 15:21:03 -- common/autotest_common.sh@1114 -- # non_locking_app_on_locked_coremask 00:05:17.888 15:21:03 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=1579346 00:05:17.888 15:21:03 -- event/cpu_locks.sh@81 -- # waitforlisten 1579346 /var/tmp/spdk.sock 00:05:17.888 15:21:03 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:17.888 15:21:03 -- common/autotest_common.sh@829 -- # '[' -z 1579346 ']' 00:05:17.888 15:21:03 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:17.888 15:21:03 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:17.888 15:21:03 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:17.888 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:17.888 15:21:03 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:17.888 15:21:03 -- common/autotest_common.sh@10 -- # set +x 00:05:17.888 [2024-12-02 15:21:03.565116] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:17.888 [2024-12-02 15:21:03.565186] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1579346 ] 00:05:17.888 EAL: No free 2048 kB hugepages reported on node 1 00:05:17.888 [2024-12-02 15:21:03.632523] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:17.888 [2024-12-02 15:21:03.698710] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:17.888 [2024-12-02 15:21:03.698827] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:18.828 15:21:04 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:18.828 15:21:04 -- common/autotest_common.sh@862 -- # return 0 00:05:18.828 15:21:04 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:05:18.828 15:21:04 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=1579799 00:05:18.829 15:21:04 -- event/cpu_locks.sh@85 -- # waitforlisten 1579799 /var/tmp/spdk2.sock 00:05:18.829 15:21:04 -- common/autotest_common.sh@829 -- # '[' -z 1579799 ']' 00:05:18.829 15:21:04 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:18.829 15:21:04 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:18.829 15:21:04 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:18.829 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
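non_locking_app_on_locked_coremask starts a second target on the same core as the first, but with cpumask locking disabled and its own RPC socket, so both instances coexist. The launch pattern reduced to its essentials (binary path and socket names as in the trace):

# Two spdk_tgt instances sharing core 0: the first takes the CPU lock,
# the second opts out of locking, so both start successfully.
SPDK_BIN=build/bin/spdk_tgt

$SPDK_BIN -m 0x1 &                                        # holds the core-0 lock
pid1=$!
$SPDK_BIN -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &
pid2=$!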
00:05:18.829 15:21:04 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:18.829 15:21:04 -- common/autotest_common.sh@10 -- # set +x 00:05:18.829 [2024-12-02 15:21:04.409423] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:18.829 [2024-12-02 15:21:04.409476] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1579799 ] 00:05:18.829 EAL: No free 2048 kB hugepages reported on node 1 00:05:18.829 [2024-12-02 15:21:04.501743] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:18.829 [2024-12-02 15:21:04.501776] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:19.088 [2024-12-02 15:21:04.652248] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:19.088 [2024-12-02 15:21:04.652367] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:19.653 15:21:05 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:19.653 15:21:05 -- common/autotest_common.sh@862 -- # return 0 00:05:19.653 15:21:05 -- event/cpu_locks.sh@87 -- # locks_exist 1579346 00:05:19.653 15:21:05 -- event/cpu_locks.sh@22 -- # lslocks -p 1579346 00:05:19.653 15:21:05 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:20.587 lslocks: write error 00:05:20.587 15:21:06 -- event/cpu_locks.sh@89 -- # killprocess 1579346 00:05:20.587 15:21:06 -- common/autotest_common.sh@936 -- # '[' -z 1579346 ']' 00:05:20.587 15:21:06 -- common/autotest_common.sh@940 -- # kill -0 1579346 00:05:20.587 15:21:06 -- common/autotest_common.sh@941 -- # uname 00:05:20.587 15:21:06 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:20.587 15:21:06 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1579346 00:05:20.587 15:21:06 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:20.587 15:21:06 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:20.587 15:21:06 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1579346' 00:05:20.587 killing process with pid 1579346 00:05:20.587 15:21:06 -- common/autotest_common.sh@955 -- # kill 1579346 00:05:20.587 15:21:06 -- common/autotest_common.sh@960 -- # wait 1579346 00:05:21.153 15:21:06 -- event/cpu_locks.sh@90 -- # killprocess 1579799 00:05:21.153 15:21:06 -- common/autotest_common.sh@936 -- # '[' -z 1579799 ']' 00:05:21.153 15:21:06 -- common/autotest_common.sh@940 -- # kill -0 1579799 00:05:21.153 15:21:06 -- common/autotest_common.sh@941 -- # uname 00:05:21.153 15:21:06 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:21.153 15:21:06 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1579799 00:05:21.153 15:21:06 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:21.153 15:21:06 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:21.153 15:21:06 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1579799' 00:05:21.153 killing process with pid 1579799 00:05:21.153 15:21:06 -- common/autotest_common.sh@955 -- # kill 1579799 00:05:21.153 15:21:06 -- common/autotest_common.sh@960 -- # wait 1579799 00:05:21.411 00:05:21.411 real 0m3.651s 00:05:21.411 user 0m3.912s 00:05:21.411 sys 0m1.140s 00:05:21.411 15:21:07 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:21.411 15:21:07 -- common/autotest_common.sh@10 -- # set +x 00:05:21.411 
************************************ 00:05:21.411 END TEST non_locking_app_on_locked_coremask 00:05:21.411 ************************************ 00:05:21.411 15:21:07 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:05:21.411 15:21:07 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:21.411 15:21:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:21.411 15:21:07 -- common/autotest_common.sh@10 -- # set +x 00:05:21.668 ************************************ 00:05:21.669 START TEST locking_app_on_unlocked_coremask 00:05:21.669 ************************************ 00:05:21.669 15:21:07 -- common/autotest_common.sh@1114 -- # locking_app_on_unlocked_coremask 00:05:21.669 15:21:07 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=1580597 00:05:21.669 15:21:07 -- event/cpu_locks.sh@99 -- # waitforlisten 1580597 /var/tmp/spdk.sock 00:05:21.669 15:21:07 -- common/autotest_common.sh@829 -- # '[' -z 1580597 ']' 00:05:21.669 15:21:07 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:21.669 15:21:07 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:21.669 15:21:07 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:21.669 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:21.669 15:21:07 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:21.669 15:21:07 -- common/autotest_common.sh@10 -- # set +x 00:05:21.669 15:21:07 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:05:21.669 [2024-12-02 15:21:07.258014] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:21.669 [2024-12-02 15:21:07.258081] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1580597 ] 00:05:21.669 EAL: No free 2048 kB hugepages reported on node 1 00:05:21.669 [2024-12-02 15:21:07.325207] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:21.669 [2024-12-02 15:21:07.325233] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:21.669 [2024-12-02 15:21:07.399094] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:21.669 [2024-12-02 15:21:07.399198] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:22.603 15:21:08 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:22.603 15:21:08 -- common/autotest_common.sh@862 -- # return 0 00:05:22.603 15:21:08 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=1580629 00:05:22.603 15:21:08 -- event/cpu_locks.sh@103 -- # waitforlisten 1580629 /var/tmp/spdk2.sock 00:05:22.603 15:21:08 -- common/autotest_common.sh@829 -- # '[' -z 1580629 ']' 00:05:22.603 15:21:08 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:22.603 15:21:08 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:22.603 15:21:08 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:22.603 15:21:08 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 
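The recurring locks_exist check asks lslocks which file locks the target PID holds and greps for the spdk_cpu_lock prefix; the "lslocks: write error" lines scattered through the trace are harmless noise from lslocks writing to a pipe that grep -q has already closed. A sketch of the check:

# locks_exist, as used after each target starts: does this PID hold a
# file lock whose path mentions spdk_cpu_lock?
locks_exist() {
    local pid=$1
    lslocks -p "$pid" | grep -q spdk_cpu_lock
    # grep -q exits on the first match; lslocks may then report
    # "write error" on the broken pipe, which the tests ignore.
}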
00:05:22.603 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:22.603 15:21:08 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:22.603 15:21:08 -- common/autotest_common.sh@10 -- # set +x 00:05:22.603 [2024-12-02 15:21:08.083454] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:22.603 [2024-12-02 15:21:08.083500] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1580629 ] 00:05:22.603 EAL: No free 2048 kB hugepages reported on node 1 00:05:22.603 [2024-12-02 15:21:08.174901] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:22.603 [2024-12-02 15:21:08.319956] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:22.603 [2024-12-02 15:21:08.320073] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:23.172 15:21:08 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:23.172 15:21:08 -- common/autotest_common.sh@862 -- # return 0 00:05:23.172 15:21:08 -- event/cpu_locks.sh@105 -- # locks_exist 1580629 00:05:23.172 15:21:08 -- event/cpu_locks.sh@22 -- # lslocks -p 1580629 00:05:23.172 15:21:08 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:23.740 lslocks: write error 00:05:23.740 15:21:09 -- event/cpu_locks.sh@107 -- # killprocess 1580597 00:05:23.740 15:21:09 -- common/autotest_common.sh@936 -- # '[' -z 1580597 ']' 00:05:23.740 15:21:09 -- common/autotest_common.sh@940 -- # kill -0 1580597 00:05:23.740 15:21:09 -- common/autotest_common.sh@941 -- # uname 00:05:23.740 15:21:09 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:23.740 15:21:09 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1580597 00:05:23.740 15:21:09 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:23.740 15:21:09 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:23.740 15:21:09 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1580597' 00:05:23.740 killing process with pid 1580597 00:05:23.740 15:21:09 -- common/autotest_common.sh@955 -- # kill 1580597 00:05:23.740 15:21:09 -- common/autotest_common.sh@960 -- # wait 1580597 00:05:24.310 15:21:10 -- event/cpu_locks.sh@108 -- # killprocess 1580629 00:05:24.310 15:21:10 -- common/autotest_common.sh@936 -- # '[' -z 1580629 ']' 00:05:24.310 15:21:10 -- common/autotest_common.sh@940 -- # kill -0 1580629 00:05:24.310 15:21:10 -- common/autotest_common.sh@941 -- # uname 00:05:24.310 15:21:10 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:24.310 15:21:10 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1580629 00:05:24.570 15:21:10 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:24.570 15:21:10 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:24.570 15:21:10 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1580629' 00:05:24.570 killing process with pid 1580629 00:05:24.570 15:21:10 -- common/autotest_common.sh@955 -- # kill 1580629 00:05:24.570 15:21:10 -- common/autotest_common.sh@960 -- # wait 1580629 00:05:24.830 00:05:24.830 real 0m3.238s 00:05:24.830 user 0m3.445s 00:05:24.830 sys 0m0.976s 00:05:24.830 15:21:10 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:24.830 15:21:10 -- common/autotest_common.sh@10 -- # set +x 00:05:24.830 
************************************ 00:05:24.830 END TEST locking_app_on_unlocked_coremask 00:05:24.830 ************************************ 00:05:24.830 15:21:10 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:05:24.830 15:21:10 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:24.830 15:21:10 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:24.830 15:21:10 -- common/autotest_common.sh@10 -- # set +x 00:05:24.830 ************************************ 00:05:24.830 START TEST locking_app_on_locked_coremask 00:05:24.830 ************************************ 00:05:24.830 15:21:10 -- common/autotest_common.sh@1114 -- # locking_app_on_locked_coremask 00:05:24.830 15:21:10 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=1581195 00:05:24.830 15:21:10 -- event/cpu_locks.sh@116 -- # waitforlisten 1581195 /var/tmp/spdk.sock 00:05:24.830 15:21:10 -- common/autotest_common.sh@829 -- # '[' -z 1581195 ']' 00:05:24.830 15:21:10 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:24.830 15:21:10 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:24.830 15:21:10 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:24.830 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:24.830 15:21:10 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:24.830 15:21:10 -- common/autotest_common.sh@10 -- # set +x 00:05:24.830 15:21:10 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:24.830 [2024-12-02 15:21:10.541492] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:24.830 [2024-12-02 15:21:10.541582] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1581195 ] 00:05:24.830 EAL: No free 2048 kB hugepages reported on node 1 00:05:24.830 [2024-12-02 15:21:10.610516] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:25.095 [2024-12-02 15:21:10.686415] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:25.095 [2024-12-02 15:21:10.686523] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:25.758 15:21:11 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:25.758 15:21:11 -- common/autotest_common.sh@862 -- # return 0 00:05:25.758 15:21:11 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=1581222 00:05:25.758 15:21:11 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 1581222 /var/tmp/spdk2.sock 00:05:25.758 15:21:11 -- common/autotest_common.sh@650 -- # local es=0 00:05:25.758 15:21:11 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 1581222 /var/tmp/spdk2.sock 00:05:25.758 15:21:11 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:05:25.758 15:21:11 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:25.758 15:21:11 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:25.758 15:21:11 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:05:25.758 15:21:11 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:25.758 15:21:11 -- common/autotest_common.sh@653 -- # waitforlisten 1581222 /var/tmp/spdk2.sock 00:05:25.758 15:21:11 -- common/autotest_common.sh@829 -- # '[' -z 1581222 ']' 00:05:25.758 15:21:11 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:25.759 15:21:11 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:25.759 15:21:11 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:25.759 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:25.759 15:21:11 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:25.759 15:21:11 -- common/autotest_common.sh@10 -- # set +x 00:05:25.759 [2024-12-02 15:21:11.379528] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:25.759 [2024-12-02 15:21:11.379608] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1581222 ] 00:05:25.759 EAL: No free 2048 kB hugepages reported on node 1 00:05:25.759 [2024-12-02 15:21:11.476014] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 1581195 has claimed it. 00:05:25.759 [2024-12-02 15:21:11.476051] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 
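This is the expected failure path: the second target, asked for the same core mask without --disable-cpumask-locks, cannot take the per-core lock file and exits, and the NOT waitforlisten wrapper just below turns that startup failure into a test pass. The lock is an advisory lock on a per-core file; the sketch below uses flock(1) as an illustrative stand-in for the app's internal file locking (file paths follow the check_remaining_locks output later in the trace):

# Sketch of the per-core lock claim that fails above. flock(1) here is
# an assumption standing in for the app's own locking mechanism.
claim_core() {
    local core=$1
    local lock
    lock=$(printf '/var/tmp/spdk_cpu_lock_%03d' "$core")
    exec 9>"$lock"
    if ! flock -n 9; then
        echo "Cannot create lock on core $core," \
             "probably another process has claimed it" >&2
        return 1
    fi
}

claim_core 0 || exit 1   # the second instance hits this branch and exits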
00:05:26.327 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (1581222) - No such process 00:05:26.327 ERROR: process (pid: 1581222) is no longer running 00:05:26.327 15:21:12 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:26.327 15:21:12 -- common/autotest_common.sh@862 -- # return 1 00:05:26.327 15:21:12 -- common/autotest_common.sh@653 -- # es=1 00:05:26.327 15:21:12 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:26.327 15:21:12 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:26.327 15:21:12 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:26.327 15:21:12 -- event/cpu_locks.sh@122 -- # locks_exist 1581195 00:05:26.327 15:21:12 -- event/cpu_locks.sh@22 -- # lslocks -p 1581195 00:05:26.327 15:21:12 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:26.895 lslocks: write error 00:05:26.895 15:21:12 -- event/cpu_locks.sh@124 -- # killprocess 1581195 00:05:26.895 15:21:12 -- common/autotest_common.sh@936 -- # '[' -z 1581195 ']' 00:05:26.895 15:21:12 -- common/autotest_common.sh@940 -- # kill -0 1581195 00:05:26.895 15:21:12 -- common/autotest_common.sh@941 -- # uname 00:05:26.895 15:21:12 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:26.895 15:21:12 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1581195 00:05:26.895 15:21:12 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:26.895 15:21:12 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:26.895 15:21:12 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1581195' 00:05:26.895 killing process with pid 1581195 00:05:26.895 15:21:12 -- common/autotest_common.sh@955 -- # kill 1581195 00:05:26.895 15:21:12 -- common/autotest_common.sh@960 -- # wait 1581195 00:05:27.154 00:05:27.154 real 0m2.270s 00:05:27.154 user 0m2.473s 00:05:27.154 sys 0m0.666s 00:05:27.154 15:21:12 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:27.154 15:21:12 -- common/autotest_common.sh@10 -- # set +x 00:05:27.155 ************************************ 00:05:27.155 END TEST locking_app_on_locked_coremask 00:05:27.155 ************************************ 00:05:27.155 15:21:12 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:05:27.155 15:21:12 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:27.155 15:21:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:27.155 15:21:12 -- common/autotest_common.sh@10 -- # set +x 00:05:27.155 ************************************ 00:05:27.155 START TEST locking_overlapped_coremask 00:05:27.155 ************************************ 00:05:27.155 15:21:12 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask 00:05:27.155 15:21:12 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=1581509 00:05:27.155 15:21:12 -- event/cpu_locks.sh@133 -- # waitforlisten 1581509 /var/tmp/spdk.sock 00:05:27.155 15:21:12 -- common/autotest_common.sh@829 -- # '[' -z 1581509 ']' 00:05:27.155 15:21:12 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:27.155 15:21:12 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:27.155 15:21:12 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:27.155 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:27.155 15:21:12 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:27.155 15:21:12 -- common/autotest_common.sh@10 -- # set +x 00:05:27.155 15:21:12 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:05:27.155 [2024-12-02 15:21:12.856416] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:27.155 [2024-12-02 15:21:12.856504] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1581509 ] 00:05:27.155 EAL: No free 2048 kB hugepages reported on node 1 00:05:27.155 [2024-12-02 15:21:12.926939] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:27.414 [2024-12-02 15:21:13.002557] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:27.414 [2024-12-02 15:21:13.002699] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:27.414 [2024-12-02 15:21:13.002817] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:27.414 [2024-12-02 15:21:13.002820] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:27.981 15:21:13 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:27.981 15:21:13 -- common/autotest_common.sh@862 -- # return 0 00:05:27.981 15:21:13 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=1581781 00:05:27.981 15:21:13 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:05:27.981 15:21:13 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 1581781 /var/tmp/spdk2.sock 00:05:27.981 15:21:13 -- common/autotest_common.sh@650 -- # local es=0 00:05:27.981 15:21:13 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 1581781 /var/tmp/spdk2.sock 00:05:27.981 15:21:13 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:05:27.981 15:21:13 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:27.981 15:21:13 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:05:27.981 15:21:13 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:27.981 15:21:13 -- common/autotest_common.sh@653 -- # waitforlisten 1581781 /var/tmp/spdk2.sock 00:05:27.981 15:21:13 -- common/autotest_common.sh@829 -- # '[' -z 1581781 ']' 00:05:27.981 15:21:13 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:27.981 15:21:13 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:27.981 15:21:13 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:27.981 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:27.981 15:21:13 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:27.981 15:21:13 -- common/autotest_common.sh@10 -- # set +x 00:05:27.981 [2024-12-02 15:21:13.711906] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
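Before the second target is launched below, it is worth spelling out why its mask must collide: the first spdk_tgt runs with -m 0x7 (binary 00111, cores 0-2, matching the three reactors just started), while the second uses -m 0x1c (binary 11100, cores 2-4). A quick worked check, assuming the contested set is simply the bitwise AND of the two masks:

m1=0x7    # first spdk_tgt: cores 0,1,2
m2=0x1c   # second spdk_tgt: cores 2,3,4
printf 'overlap mask: 0x%x\n' $(( m1 & m2 ))   # 0x4 -> core 2

Core 2 is the only shared core, which is exactly the core named in the "Cannot create lock on core 2" error further down.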
00:05:27.981 [2024-12-02 15:21:13.711991] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1581781 ] 00:05:27.981 EAL: No free 2048 kB hugepages reported on node 1 00:05:28.240 [2024-12-02 15:21:13.806875] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1581509 has claimed it. 00:05:28.240 [2024-12-02 15:21:13.806922] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:05:28.810 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (1581781) - No such process 00:05:28.810 ERROR: process (pid: 1581781) is no longer running 00:05:28.810 15:21:14 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:28.810 15:21:14 -- common/autotest_common.sh@862 -- # return 1 00:05:28.810 15:21:14 -- common/autotest_common.sh@653 -- # es=1 00:05:28.810 15:21:14 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:28.810 15:21:14 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:28.810 15:21:14 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:28.810 15:21:14 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:05:28.810 15:21:14 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:05:28.810 15:21:14 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:05:28.810 15:21:14 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:05:28.810 15:21:14 -- event/cpu_locks.sh@141 -- # killprocess 1581509 00:05:28.810 15:21:14 -- common/autotest_common.sh@936 -- # '[' -z 1581509 ']' 00:05:28.810 15:21:14 -- common/autotest_common.sh@940 -- # kill -0 1581509 00:05:28.810 15:21:14 -- common/autotest_common.sh@941 -- # uname 00:05:28.810 15:21:14 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:28.810 15:21:14 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1581509 00:05:28.810 15:21:14 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:28.810 15:21:14 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:28.810 15:21:14 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1581509' 00:05:28.810 killing process with pid 1581509 00:05:28.810 15:21:14 -- common/autotest_common.sh@955 -- # kill 1581509 00:05:28.810 15:21:14 -- common/autotest_common.sh@960 -- # wait 1581509 00:05:29.070 00:05:29.070 real 0m1.905s 00:05:29.070 user 0m5.381s 00:05:29.070 sys 0m0.455s 00:05:29.070 15:21:14 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:29.070 15:21:14 -- common/autotest_common.sh@10 -- # set +x 00:05:29.070 ************************************ 00:05:29.070 END TEST locking_overlapped_coremask 00:05:29.070 ************************************ 00:05:29.070 15:21:14 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:05:29.070 15:21:14 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:29.070 15:21:14 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:29.070 15:21:14 -- common/autotest_common.sh@10 -- # set +x 00:05:29.070 ************************************ 00:05:29.070 
START TEST locking_overlapped_coremask_via_rpc 00:05:29.070 ************************************ 00:05:29.070 15:21:14 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask_via_rpc 00:05:29.070 15:21:14 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=1582010 00:05:29.070 15:21:14 -- event/cpu_locks.sh@149 -- # waitforlisten 1582010 /var/tmp/spdk.sock 00:05:29.070 15:21:14 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:05:29.070 15:21:14 -- common/autotest_common.sh@829 -- # '[' -z 1582010 ']' 00:05:29.070 15:21:14 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:29.070 15:21:14 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:29.070 15:21:14 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:29.070 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:29.070 15:21:14 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:29.070 15:21:14 -- common/autotest_common.sh@10 -- # set +x 00:05:29.070 [2024-12-02 15:21:14.811137] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:29.070 [2024-12-02 15:21:14.811227] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1582010 ] 00:05:29.070 EAL: No free 2048 kB hugepages reported on node 1 00:05:29.070 [2024-12-02 15:21:14.880272] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:29.070 [2024-12-02 15:21:14.880297] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:29.330 [2024-12-02 15:21:14.957192] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:29.330 [2024-12-02 15:21:14.957340] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:29.330 [2024-12-02 15:21:14.957454] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:29.330 [2024-12-02 15:21:14.957454] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:29.896 15:21:15 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:29.896 15:21:15 -- common/autotest_common.sh@862 -- # return 0 00:05:29.896 15:21:15 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=1582091 00:05:29.896 15:21:15 -- event/cpu_locks.sh@153 -- # waitforlisten 1582091 /var/tmp/spdk2.sock 00:05:29.896 15:21:15 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:05:29.896 15:21:15 -- common/autotest_common.sh@829 -- # '[' -z 1582091 ']' 00:05:29.896 15:21:15 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:29.896 15:21:15 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:29.896 15:21:15 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:29.896 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:05:29.896 15:21:15 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:29.896 15:21:15 -- common/autotest_common.sh@10 -- # set +x 00:05:29.896 [2024-12-02 15:21:15.668808] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:29.896 [2024-12-02 15:21:15.668891] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1582091 ] 00:05:29.896 EAL: No free 2048 kB hugepages reported on node 1 00:05:30.155 [2024-12-02 15:21:15.765513] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:30.155 [2024-12-02 15:21:15.765542] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:30.155 [2024-12-02 15:21:15.909231] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:30.155 [2024-12-02 15:21:15.909394] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:30.155 [2024-12-02 15:21:15.912795] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:30.155 [2024-12-02 15:21:15.912797] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:05:30.724 15:21:16 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:30.724 15:21:16 -- common/autotest_common.sh@862 -- # return 0 00:05:30.724 15:21:16 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:05:30.724 15:21:16 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:30.724 15:21:16 -- common/autotest_common.sh@10 -- # set +x 00:05:30.724 15:21:16 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:30.724 15:21:16 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:30.724 15:21:16 -- common/autotest_common.sh@650 -- # local es=0 00:05:30.724 15:21:16 -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:30.724 15:21:16 -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:05:30.724 15:21:16 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:30.724 15:21:16 -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:05:30.724 15:21:16 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:30.724 15:21:16 -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:30.724 15:21:16 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:30.724 15:21:16 -- common/autotest_common.sh@10 -- # set +x 00:05:30.724 [2024-12-02 15:21:16.536800] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1582010 has claimed it. 
00:05:30.984 request: 00:05:30.984 { 00:05:30.984 "method": "framework_enable_cpumask_locks", 00:05:30.984 "req_id": 1 00:05:30.984 } 00:05:30.984 Got JSON-RPC error response 00:05:30.984 response: 00:05:30.984 { 00:05:30.984 "code": -32603, 00:05:30.984 "message": "Failed to claim CPU core: 2" 00:05:30.984 } 00:05:30.984 15:21:16 -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:05:30.984 15:21:16 -- common/autotest_common.sh@653 -- # es=1 00:05:30.984 15:21:16 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:30.984 15:21:16 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:30.984 15:21:16 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:30.984 15:21:16 -- event/cpu_locks.sh@158 -- # waitforlisten 1582010 /var/tmp/spdk.sock 00:05:30.984 15:21:16 -- common/autotest_common.sh@829 -- # '[' -z 1582010 ']' 00:05:30.984 15:21:16 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:30.984 15:21:16 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:30.984 15:21:16 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:30.984 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:30.984 15:21:16 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:30.984 15:21:16 -- common/autotest_common.sh@10 -- # set +x 00:05:30.984 15:21:16 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:30.984 15:21:16 -- common/autotest_common.sh@862 -- # return 0 00:05:30.984 15:21:16 -- event/cpu_locks.sh@159 -- # waitforlisten 1582091 /var/tmp/spdk2.sock 00:05:30.984 15:21:16 -- common/autotest_common.sh@829 -- # '[' -z 1582091 ']' 00:05:30.984 15:21:16 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:30.984 15:21:16 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:30.984 15:21:16 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:30.984 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:05:30.984 15:21:16 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:30.984 15:21:16 -- common/autotest_common.sh@10 -- # set +x 00:05:31.243 15:21:16 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:31.243 15:21:16 -- common/autotest_common.sh@862 -- # return 0 00:05:31.243 15:21:16 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:05:31.243 15:21:16 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:05:31.243 15:21:16 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:05:31.243 15:21:16 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:05:31.243 00:05:31.243 real 0m2.154s 00:05:31.243 user 0m0.866s 00:05:31.243 sys 0m0.216s 00:05:31.243 15:21:16 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:31.243 15:21:16 -- common/autotest_common.sh@10 -- # set +x 00:05:31.243 ************************************ 00:05:31.243 END TEST locking_overlapped_coremask_via_rpc 00:05:31.243 ************************************ 00:05:31.243 15:21:16 -- event/cpu_locks.sh@174 -- # cleanup 00:05:31.243 15:21:16 -- event/cpu_locks.sh@15 -- # [[ -z 1582010 ]] 00:05:31.243 15:21:16 -- event/cpu_locks.sh@15 -- # killprocess 1582010 00:05:31.243 15:21:16 -- common/autotest_common.sh@936 -- # '[' -z 1582010 ']' 00:05:31.243 15:21:16 -- common/autotest_common.sh@940 -- # kill -0 1582010 00:05:31.243 15:21:16 -- common/autotest_common.sh@941 -- # uname 00:05:31.243 15:21:16 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:31.243 15:21:16 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1582010 00:05:31.243 15:21:17 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:31.243 15:21:17 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:31.243 15:21:17 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1582010' 00:05:31.243 killing process with pid 1582010 00:05:31.243 15:21:17 -- common/autotest_common.sh@955 -- # kill 1582010 00:05:31.243 15:21:17 -- common/autotest_common.sh@960 -- # wait 1582010 00:05:31.812 15:21:17 -- event/cpu_locks.sh@16 -- # [[ -z 1582091 ]] 00:05:31.813 15:21:17 -- event/cpu_locks.sh@16 -- # killprocess 1582091 00:05:31.813 15:21:17 -- common/autotest_common.sh@936 -- # '[' -z 1582091 ']' 00:05:31.813 15:21:17 -- common/autotest_common.sh@940 -- # kill -0 1582091 00:05:31.813 15:21:17 -- common/autotest_common.sh@941 -- # uname 00:05:31.813 15:21:17 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:31.813 15:21:17 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1582091 00:05:31.813 15:21:17 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:05:31.813 15:21:17 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:05:31.813 15:21:17 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1582091' 00:05:31.813 killing process with pid 1582091 00:05:31.813 15:21:17 -- common/autotest_common.sh@955 -- # kill 1582091 00:05:31.813 15:21:17 -- common/autotest_common.sh@960 -- # wait 1582091 00:05:32.073 15:21:17 -- event/cpu_locks.sh@18 -- # rm -f 00:05:32.073 15:21:17 -- event/cpu_locks.sh@1 -- # cleanup 00:05:32.073 15:21:17 -- event/cpu_locks.sh@15 -- # [[ -z 1582010 ]] 00:05:32.073 15:21:17 -- event/cpu_locks.sh@15 -- # killprocess 1582010 
00:05:32.073 15:21:17 -- common/autotest_common.sh@936 -- # '[' -z 1582010 ']' 00:05:32.073 15:21:17 -- common/autotest_common.sh@940 -- # kill -0 1582010 00:05:32.073 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (1582010) - No such process 00:05:32.073 15:21:17 -- common/autotest_common.sh@963 -- # echo 'Process with pid 1582010 is not found' 00:05:32.073 Process with pid 1582010 is not found 00:05:32.073 15:21:17 -- event/cpu_locks.sh@16 -- # [[ -z 1582091 ]] 00:05:32.073 15:21:17 -- event/cpu_locks.sh@16 -- # killprocess 1582091 00:05:32.073 15:21:17 -- common/autotest_common.sh@936 -- # '[' -z 1582091 ']' 00:05:32.073 15:21:17 -- common/autotest_common.sh@940 -- # kill -0 1582091 00:05:32.073 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (1582091) - No such process 00:05:32.073 15:21:17 -- common/autotest_common.sh@963 -- # echo 'Process with pid 1582091 is not found' 00:05:32.073 Process with pid 1582091 is not found 00:05:32.073 15:21:17 -- event/cpu_locks.sh@18 -- # rm -f 00:05:32.073 00:05:32.073 real 0m17.835s 00:05:32.073 user 0m30.549s 00:05:32.073 sys 0m5.559s 00:05:32.073 15:21:17 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:32.073 15:21:17 -- common/autotest_common.sh@10 -- # set +x 00:05:32.073 ************************************ 00:05:32.073 END TEST cpu_locks 00:05:32.073 ************************************ 00:05:32.073 00:05:32.073 real 0m43.498s 00:05:32.073 user 1m22.582s 00:05:32.073 sys 0m9.646s 00:05:32.073 15:21:17 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:32.073 15:21:17 -- common/autotest_common.sh@10 -- # set +x 00:05:32.073 ************************************ 00:05:32.073 END TEST event 00:05:32.073 ************************************ 00:05:32.073 15:21:17 -- spdk/autotest.sh@175 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:05:32.073 15:21:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:32.073 15:21:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:32.073 15:21:17 -- common/autotest_common.sh@10 -- # set +x 00:05:32.073 ************************************ 00:05:32.073 START TEST thread 00:05:32.073 ************************************ 00:05:32.073 15:21:17 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:05:32.334 * Looking for test storage... 
00:05:32.334 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:05:32.334 15:21:17 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:32.334 15:21:17 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:32.334 15:21:17 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:32.334 15:21:17 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:32.334 15:21:17 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:32.334 15:21:17 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:32.334 15:21:17 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:32.334 15:21:17 -- scripts/common.sh@335 -- # IFS=.-: 00:05:32.334 15:21:17 -- scripts/common.sh@335 -- # read -ra ver1 00:05:32.334 15:21:17 -- scripts/common.sh@336 -- # IFS=.-: 00:05:32.334 15:21:17 -- scripts/common.sh@336 -- # read -ra ver2 00:05:32.334 15:21:17 -- scripts/common.sh@337 -- # local 'op=<' 00:05:32.334 15:21:17 -- scripts/common.sh@339 -- # ver1_l=2 00:05:32.334 15:21:17 -- scripts/common.sh@340 -- # ver2_l=1 00:05:32.334 15:21:17 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:32.334 15:21:17 -- scripts/common.sh@343 -- # case "$op" in 00:05:32.334 15:21:17 -- scripts/common.sh@344 -- # : 1 00:05:32.334 15:21:17 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:32.334 15:21:17 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:32.334 15:21:17 -- scripts/common.sh@364 -- # decimal 1 00:05:32.334 15:21:17 -- scripts/common.sh@352 -- # local d=1 00:05:32.334 15:21:17 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:32.334 15:21:17 -- scripts/common.sh@354 -- # echo 1 00:05:32.334 15:21:17 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:32.334 15:21:17 -- scripts/common.sh@365 -- # decimal 2 00:05:32.334 15:21:17 -- scripts/common.sh@352 -- # local d=2 00:05:32.334 15:21:17 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:32.334 15:21:17 -- scripts/common.sh@354 -- # echo 2 00:05:32.334 15:21:17 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:32.334 15:21:17 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:32.334 15:21:17 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:32.334 15:21:17 -- scripts/common.sh@367 -- # return 0 00:05:32.334 15:21:17 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:32.334 15:21:17 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:32.334 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:32.334 --rc genhtml_branch_coverage=1 00:05:32.334 --rc genhtml_function_coverage=1 00:05:32.334 --rc genhtml_legend=1 00:05:32.334 --rc geninfo_all_blocks=1 00:05:32.334 --rc geninfo_unexecuted_blocks=1 00:05:32.334 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:32.334 ' 00:05:32.334 15:21:17 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:32.334 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:32.334 --rc genhtml_branch_coverage=1 00:05:32.334 --rc genhtml_function_coverage=1 00:05:32.334 --rc genhtml_legend=1 00:05:32.334 --rc geninfo_all_blocks=1 00:05:32.334 --rc geninfo_unexecuted_blocks=1 00:05:32.334 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:32.334 ' 00:05:32.334 15:21:17 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:32.334 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:32.334 --rc genhtml_branch_coverage=1 
00:05:32.334 --rc genhtml_function_coverage=1 00:05:32.334 --rc genhtml_legend=1 00:05:32.334 --rc geninfo_all_blocks=1 00:05:32.334 --rc geninfo_unexecuted_blocks=1 00:05:32.334 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:32.334 ' 00:05:32.334 15:21:17 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:32.334 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:32.334 --rc genhtml_branch_coverage=1 00:05:32.334 --rc genhtml_function_coverage=1 00:05:32.334 --rc genhtml_legend=1 00:05:32.334 --rc geninfo_all_blocks=1 00:05:32.334 --rc geninfo_unexecuted_blocks=1 00:05:32.334 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:32.334 ' 00:05:32.334 15:21:17 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:05:32.334 15:21:17 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:05:32.334 15:21:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:32.334 15:21:17 -- common/autotest_common.sh@10 -- # set +x 00:05:32.334 ************************************ 00:05:32.334 START TEST thread_poller_perf 00:05:32.334 ************************************ 00:05:32.334 15:21:18 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:05:32.334 [2024-12-02 15:21:18.023126] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:32.334 [2024-12-02 15:21:18.023220] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1582722 ] 00:05:32.334 EAL: No free 2048 kB hugepages reported on node 1 00:05:32.334 [2024-12-02 15:21:18.093956] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:32.594 [2024-12-02 15:21:18.164417] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:32.594 Running 1000 pollers for 1 seconds with 1 microseconds period. 
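The summary that follows reports raw TSC figures; as far as the output shows, the poller_cost line is just busy cycles divided by completed runs, converted to nanoseconds via tsc_hz. Reproducing the arithmetic with the numbers printed below (a sketch of the derivation, not the tool's source):

busy=2506280954; runs=799000; tsc_hz=2500000000
echo $(( busy / runs ))                             # 3136 cyc per poll
echo $(( busy / runs * 1000000000 / tsc_hz ))       # 1254 nsec at 2.5 GHz

The second run further down (0 microseconds period) lands at 187 cyc / 74 nsec by the same division, which is why the tighter poll loop reports a far lower per-poll cost.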
00:05:33.534 [2024-12-02T14:21:19.353Z] ====================================== 00:05:33.534 [2024-12-02T14:21:19.353Z] busy:2506280954 (cyc) 00:05:33.534 [2024-12-02T14:21:19.353Z] total_run_count: 799000 00:05:33.534 [2024-12-02T14:21:19.353Z] tsc_hz: 2500000000 (cyc) 00:05:33.534 [2024-12-02T14:21:19.353Z] ====================================== 00:05:33.534 [2024-12-02T14:21:19.353Z] poller_cost: 3136 (cyc), 1254 (nsec) 00:05:33.534 00:05:33.534 real 0m1.227s 00:05:33.534 user 0m1.136s 00:05:33.534 sys 0m0.087s 00:05:33.534 15:21:19 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:33.534 15:21:19 -- common/autotest_common.sh@10 -- # set +x 00:05:33.534 ************************************ 00:05:33.534 END TEST thread_poller_perf 00:05:33.534 ************************************ 00:05:33.534 15:21:19 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:05:33.534 15:21:19 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:05:33.534 15:21:19 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:33.534 15:21:19 -- common/autotest_common.sh@10 -- # set +x 00:05:33.534 ************************************ 00:05:33.534 START TEST thread_poller_perf 00:05:33.534 ************************************ 00:05:33.534 15:21:19 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:05:33.534 [2024-12-02 15:21:19.295262] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:33.534 [2024-12-02 15:21:19.295351] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1582900 ] 00:05:33.534 EAL: No free 2048 kB hugepages reported on node 1 00:05:33.796 [2024-12-02 15:21:19.366715] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:33.796 [2024-12-02 15:21:19.436259] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:33.796 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:05:34.736 [2024-12-02T14:21:20.555Z] ====================================== 00:05:34.736 [2024-12-02T14:21:20.555Z] busy:2502053180 (cyc) 00:05:34.736 [2024-12-02T14:21:20.555Z] total_run_count: 13374000 00:05:34.736 [2024-12-02T14:21:20.555Z] tsc_hz: 2500000000 (cyc) 00:05:34.736 [2024-12-02T14:21:20.555Z] ====================================== 00:05:34.736 [2024-12-02T14:21:20.555Z] poller_cost: 187 (cyc), 74 (nsec) 00:05:34.736 00:05:34.736 real 0m1.224s 00:05:34.736 user 0m1.131s 00:05:34.736 sys 0m0.089s 00:05:34.736 15:21:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:34.736 15:21:20 -- common/autotest_common.sh@10 -- # set +x 00:05:34.736 ************************************ 00:05:34.736 END TEST thread_poller_perf 00:05:34.736 ************************************ 00:05:34.736 15:21:20 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:05:34.736 15:21:20 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:05:34.736 15:21:20 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:34.736 15:21:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:34.736 15:21:20 -- common/autotest_common.sh@10 -- # set +x 00:05:34.736 ************************************ 00:05:34.736 START TEST thread_spdk_lock 00:05:34.736 ************************************ 00:05:34.736 15:21:20 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:05:34.996 [2024-12-02 15:21:20.570430] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:34.996 [2024-12-02 15:21:20.570520] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1583064 ] 00:05:34.996 EAL: No free 2048 kB hugepages reported on node 1 00:05:34.996 [2024-12-02 15:21:20.642389] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:34.996 [2024-12-02 15:21:20.711695] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:34.996 [2024-12-02 15:21:20.711697] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:35.565 [2024-12-02 15:21:21.204812] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 957:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:35.565 [2024-12-02 15:21:21.204850] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3064:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:05:35.565 [2024-12-02 15:21:21.204860] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3019:sspin_stacks_print: *ERROR*: spinlock 0x1483c80 00:05:35.565 [2024-12-02 15:21:21.205765] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:35.565 [2024-12-02 15:21:21.205868] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1018:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:35.566 [2024-12-02 15:21:21.205898] 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:35.566 Starting test contend 00:05:35.566 Worker Delay Wait us Hold us Total us 00:05:35.566 0 3 178735 186013 364749 00:05:35.566 1 5 93332 287031 380363 00:05:35.566 PASS test contend 00:05:35.566 Starting test hold_by_poller 00:05:35.566 PASS test hold_by_poller 00:05:35.566 Starting test hold_by_message 00:05:35.566 PASS test hold_by_message 00:05:35.566 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:05:35.566 100014 assertions passed 00:05:35.566 0 assertions failed 00:05:35.566 00:05:35.566 real 0m0.714s 00:05:35.566 user 0m1.110s 00:05:35.566 sys 0m0.094s 00:05:35.566 15:21:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:35.566 15:21:21 -- common/autotest_common.sh@10 -- # set +x 00:05:35.566 ************************************ 00:05:35.566 END TEST thread_spdk_lock 00:05:35.566 ************************************ 00:05:35.566 00:05:35.566 real 0m3.489s 00:05:35.566 user 0m3.522s 00:05:35.566 sys 0m0.491s 00:05:35.566 15:21:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:35.566 15:21:21 -- common/autotest_common.sh@10 -- # set +x 00:05:35.566 ************************************ 00:05:35.566 END TEST thread 00:05:35.566 ************************************ 00:05:35.566 15:21:21 -- spdk/autotest.sh@176 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:05:35.566 15:21:21 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:35.566 15:21:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:35.566 15:21:21 -- common/autotest_common.sh@10 -- # set +x 00:05:35.566 ************************************ 00:05:35.566 START TEST accel 00:05:35.566 ************************************ 00:05:35.566 15:21:21 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:05:35.826 * Looking for test storage... 00:05:35.826 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:05:35.826 15:21:21 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:35.826 15:21:21 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:35.826 15:21:21 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:35.826 15:21:21 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:35.826 15:21:21 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:35.826 15:21:21 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:35.826 15:21:21 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:35.826 15:21:21 -- scripts/common.sh@335 -- # IFS=.-: 00:05:35.826 15:21:21 -- scripts/common.sh@335 -- # read -ra ver1 00:05:35.826 15:21:21 -- scripts/common.sh@336 -- # IFS=.-: 00:05:35.826 15:21:21 -- scripts/common.sh@336 -- # read -ra ver2 00:05:35.826 15:21:21 -- scripts/common.sh@337 -- # local 'op=<' 00:05:35.826 15:21:21 -- scripts/common.sh@339 -- # ver1_l=2 00:05:35.826 15:21:21 -- scripts/common.sh@340 -- # ver2_l=1 00:05:35.826 15:21:21 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:35.826 15:21:21 -- scripts/common.sh@343 -- # case "$op" in 00:05:35.826 15:21:21 -- scripts/common.sh@344 -- # : 1 00:05:35.826 15:21:21 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:35.826 15:21:21 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:35.826 15:21:21 -- scripts/common.sh@364 -- # decimal 1 00:05:35.826 15:21:21 -- scripts/common.sh@352 -- # local d=1 00:05:35.826 15:21:21 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:35.826 15:21:21 -- scripts/common.sh@354 -- # echo 1 00:05:35.826 15:21:21 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:35.826 15:21:21 -- scripts/common.sh@365 -- # decimal 2 00:05:35.826 15:21:21 -- scripts/common.sh@352 -- # local d=2 00:05:35.826 15:21:21 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:35.826 15:21:21 -- scripts/common.sh@354 -- # echo 2 00:05:35.826 15:21:21 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:35.826 15:21:21 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:35.826 15:21:21 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:35.826 15:21:21 -- scripts/common.sh@367 -- # return 0 00:05:35.826 15:21:21 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:35.826 15:21:21 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:35.826 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:35.826 --rc genhtml_branch_coverage=1 00:05:35.826 --rc genhtml_function_coverage=1 00:05:35.826 --rc genhtml_legend=1 00:05:35.826 --rc geninfo_all_blocks=1 00:05:35.826 --rc geninfo_unexecuted_blocks=1 00:05:35.826 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:35.826 ' 00:05:35.826 15:21:21 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:35.826 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:35.826 --rc genhtml_branch_coverage=1 00:05:35.826 --rc genhtml_function_coverage=1 00:05:35.826 --rc genhtml_legend=1 00:05:35.826 --rc geninfo_all_blocks=1 00:05:35.826 --rc geninfo_unexecuted_blocks=1 00:05:35.826 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:35.826 ' 00:05:35.826 15:21:21 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:35.826 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:35.826 --rc genhtml_branch_coverage=1 00:05:35.826 --rc genhtml_function_coverage=1 00:05:35.826 --rc genhtml_legend=1 00:05:35.826 --rc geninfo_all_blocks=1 00:05:35.826 --rc geninfo_unexecuted_blocks=1 00:05:35.826 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:35.826 ' 00:05:35.826 15:21:21 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:35.826 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:35.826 --rc genhtml_branch_coverage=1 00:05:35.826 --rc genhtml_function_coverage=1 00:05:35.826 --rc genhtml_legend=1 00:05:35.826 --rc geninfo_all_blocks=1 00:05:35.826 --rc geninfo_unexecuted_blocks=1 00:05:35.826 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:35.826 ' 00:05:35.826 15:21:21 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:05:35.826 15:21:21 -- accel/accel.sh@74 -- # get_expected_opcs 00:05:35.826 15:21:21 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:35.826 15:21:21 -- accel/accel.sh@59 -- # spdk_tgt_pid=1583369 00:05:35.826 15:21:21 -- accel/accel.sh@60 -- # waitforlisten 1583369 00:05:35.826 15:21:21 -- common/autotest_common.sh@829 -- # '[' -z 1583369 ']' 00:05:35.826 15:21:21 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:35.826 15:21:21 -- accel/accel.sh@58 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:05:35.826 15:21:21 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:35.826 15:21:21 -- accel/accel.sh@58 -- # build_accel_config 00:05:35.826 15:21:21 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:35.826 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:35.826 15:21:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:35.826 15:21:21 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:35.826 15:21:21 -- common/autotest_common.sh@10 -- # set +x 00:05:35.826 15:21:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:35.826 15:21:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:35.826 15:21:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:35.826 15:21:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:35.826 15:21:21 -- accel/accel.sh@41 -- # local IFS=, 00:05:35.826 15:21:21 -- accel/accel.sh@42 -- # jq -r . 00:05:35.826 [2024-12-02 15:21:21.576715] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:35.826 [2024-12-02 15:21:21.576817] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1583369 ] 00:05:35.826 EAL: No free 2048 kB hugepages reported on node 1 00:05:36.086 [2024-12-02 15:21:21.645382] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:36.086 [2024-12-02 15:21:21.716039] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:36.086 [2024-12-02 15:21:21.716158] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:36.656 15:21:22 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:36.656 15:21:22 -- common/autotest_common.sh@862 -- # return 0 00:05:36.656 15:21:22 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:05:36.656 15:21:22 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:05:36.656 15:21:22 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:36.656 15:21:22 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:05:36.656 15:21:22 -- common/autotest_common.sh@10 -- # set +x 00:05:36.656 15:21:22 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:36.656 15:21:22 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # IFS== 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.656 15:21:22 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.656 15:21:22 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # IFS== 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.656 15:21:22 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.656 15:21:22 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # IFS== 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.656 15:21:22 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.656 15:21:22 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # IFS== 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.656 15:21:22 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.656 15:21:22 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # IFS== 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.656 15:21:22 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.656 15:21:22 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # IFS== 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.656 15:21:22 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.656 15:21:22 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # IFS== 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.656 15:21:22 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.656 15:21:22 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # IFS== 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.656 15:21:22 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.656 15:21:22 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # IFS== 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.656 15:21:22 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.656 15:21:22 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # IFS== 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.656 15:21:22 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.656 15:21:22 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # IFS== 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.656 15:21:22 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.656 15:21:22 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # IFS== 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.656 15:21:22 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 
00:05:36.656 15:21:22 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # IFS== 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.656 15:21:22 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.656 15:21:22 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # IFS== 00:05:36.656 15:21:22 -- accel/accel.sh@64 -- # read -r opc module 00:05:36.656 15:21:22 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:36.656 15:21:22 -- accel/accel.sh@67 -- # killprocess 1583369 00:05:36.656 15:21:22 -- common/autotest_common.sh@936 -- # '[' -z 1583369 ']' 00:05:36.656 15:21:22 -- common/autotest_common.sh@940 -- # kill -0 1583369 00:05:36.656 15:21:22 -- common/autotest_common.sh@941 -- # uname 00:05:36.656 15:21:22 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:36.656 15:21:22 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1583369 00:05:36.916 15:21:22 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:36.916 15:21:22 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:36.916 15:21:22 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1583369' 00:05:36.916 killing process with pid 1583369 00:05:36.916 15:21:22 -- common/autotest_common.sh@955 -- # kill 1583369 00:05:36.916 15:21:22 -- common/autotest_common.sh@960 -- # wait 1583369 00:05:37.175 15:21:22 -- accel/accel.sh@68 -- # trap - ERR 00:05:37.175 15:21:22 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:05:37.175 15:21:22 -- common/autotest_common.sh@1087 -- # '[' 3 -le 1 ']' 00:05:37.175 15:21:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:37.175 15:21:22 -- common/autotest_common.sh@10 -- # set +x 00:05:37.175 15:21:22 -- common/autotest_common.sh@1114 -- # accel_perf -h 00:05:37.175 15:21:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:05:37.175 15:21:22 -- accel/accel.sh@12 -- # build_accel_config 00:05:37.175 15:21:22 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:37.175 15:21:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:37.175 15:21:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:37.175 15:21:22 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:37.175 15:21:22 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:37.175 15:21:22 -- accel/accel.sh@41 -- # local IFS=, 00:05:37.175 15:21:22 -- accel/accel.sh@42 -- # jq -r . 
00:05:37.176 15:21:22 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:37.176 15:21:22 -- common/autotest_common.sh@10 -- # set +x 00:05:37.176 15:21:22 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:05:37.176 15:21:22 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:05:37.176 15:21:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:37.176 15:21:22 -- common/autotest_common.sh@10 -- # set +x 00:05:37.176 ************************************ 00:05:37.176 START TEST accel_missing_filename 00:05:37.176 ************************************ 00:05:37.176 15:21:22 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress 00:05:37.176 15:21:22 -- common/autotest_common.sh@650 -- # local es=0 00:05:37.176 15:21:22 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress 00:05:37.176 15:21:22 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:05:37.176 15:21:22 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:37.176 15:21:22 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:05:37.176 15:21:22 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:37.176 15:21:22 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress 00:05:37.176 15:21:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:05:37.176 15:21:22 -- accel/accel.sh@12 -- # build_accel_config 00:05:37.176 15:21:22 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:37.176 15:21:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:37.176 15:21:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:37.176 15:21:22 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:37.176 15:21:22 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:37.176 15:21:22 -- accel/accel.sh@41 -- # local IFS=, 00:05:37.176 15:21:22 -- accel/accel.sh@42 -- # jq -r . 00:05:37.176 [2024-12-02 15:21:22.915940] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:37.176 [2024-12-02 15:21:22.916031] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1583679 ] 00:05:37.176 EAL: No free 2048 kB hugepages reported on node 1 00:05:37.176 [2024-12-02 15:21:22.987043] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:37.436 [2024-12-02 15:21:23.055615] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:37.436 [2024-12-02 15:21:23.094801] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:37.436 [2024-12-02 15:21:23.154516] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:05:37.436 A filename is required. 
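That abort is the expected outcome: compress needs an input file, the test invoked accel_perf -t 1 -w compress without -l, so startup stops at "A filename is required." before any work is queued, and the harness below only checks that the failure status propagates. For contrast, a compress invocation that would get past this check supplies the sample payload the very next test uses (same binary and bib file as in this workspace; a sketch, not part of the test run):

/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
    -t 1 -w compress \
    -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib

Note that adding -y on top of this is itself rejected, which is what the compress_verify test below demonstrates ("Compression does not support the verify option").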
00:05:37.436 15:21:23 -- common/autotest_common.sh@653 -- # es=234 00:05:37.436 15:21:23 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:37.436 15:21:23 -- common/autotest_common.sh@662 -- # es=106 00:05:37.436 15:21:23 -- common/autotest_common.sh@663 -- # case "$es" in 00:05:37.436 15:21:23 -- common/autotest_common.sh@670 -- # es=1 00:05:37.436 15:21:23 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:37.436 00:05:37.436 real 0m0.330s 00:05:37.436 user 0m0.235s 00:05:37.436 sys 0m0.135s 00:05:37.436 15:21:23 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:37.436 15:21:23 -- common/autotest_common.sh@10 -- # set +x 00:05:37.436 ************************************ 00:05:37.436 END TEST accel_missing_filename 00:05:37.436 ************************************ 00:05:37.695 15:21:23 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:37.695 15:21:23 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:05:37.695 15:21:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:37.695 15:21:23 -- common/autotest_common.sh@10 -- # set +x 00:05:37.695 ************************************ 00:05:37.695 START TEST accel_compress_verify 00:05:37.695 ************************************ 00:05:37.695 15:21:23 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:37.695 15:21:23 -- common/autotest_common.sh@650 -- # local es=0 00:05:37.695 15:21:23 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:37.695 15:21:23 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:05:37.695 15:21:23 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:37.695 15:21:23 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:05:37.695 15:21:23 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:37.695 15:21:23 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:37.695 15:21:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:37.695 15:21:23 -- accel/accel.sh@12 -- # build_accel_config 00:05:37.695 15:21:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:37.695 15:21:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:37.695 15:21:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:37.695 15:21:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:37.695 15:21:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:37.695 15:21:23 -- accel/accel.sh@41 -- # local IFS=, 00:05:37.695 15:21:23 -- accel/accel.sh@42 -- # jq -r . 00:05:37.695 [2024-12-02 15:21:23.294119] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:37.695 [2024-12-02 15:21:23.294212] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1583699 ] 00:05:37.695 EAL: No free 2048 kB hugepages reported on node 1 00:05:37.695 [2024-12-02 15:21:23.365702] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:37.695 [2024-12-02 15:21:23.431586] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:37.695 [2024-12-02 15:21:23.470984] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:37.955 [2024-12-02 15:21:23.531021] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:05:37.955 00:05:37.955 Compression does not support the verify option, aborting. 00:05:37.955 15:21:23 -- common/autotest_common.sh@653 -- # es=161 00:05:37.955 15:21:23 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:37.955 15:21:23 -- common/autotest_common.sh@662 -- # es=33 00:05:37.955 15:21:23 -- common/autotest_common.sh@663 -- # case "$es" in 00:05:37.955 15:21:23 -- common/autotest_common.sh@670 -- # es=1 00:05:37.955 15:21:23 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:37.955 00:05:37.955 real 0m0.329s 00:05:37.955 user 0m0.233s 00:05:37.955 sys 0m0.130s 00:05:37.955 15:21:23 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:37.955 15:21:23 -- common/autotest_common.sh@10 -- # set +x 00:05:37.955 ************************************ 00:05:37.955 END TEST accel_compress_verify 00:05:37.955 ************************************ 00:05:37.955 15:21:23 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:05:37.955 15:21:23 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:05:37.955 15:21:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:37.955 15:21:23 -- common/autotest_common.sh@10 -- # set +x 00:05:37.955 ************************************ 00:05:37.955 START TEST accel_wrong_workload 00:05:37.955 ************************************ 00:05:37.955 15:21:23 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w foobar 00:05:37.955 15:21:23 -- common/autotest_common.sh@650 -- # local es=0 00:05:37.955 15:21:23 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:05:37.955 15:21:23 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:05:37.955 15:21:23 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:37.955 15:21:23 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:05:37.955 15:21:23 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:37.955 15:21:23 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar 00:05:37.955 15:21:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:05:37.955 15:21:23 -- accel/accel.sh@12 -- # build_accel_config 00:05:37.955 15:21:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:37.955 15:21:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:37.955 15:21:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:37.955 15:21:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:37.955 15:21:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:37.955 15:21:23 -- accel/accel.sh@41 -- # local IFS=, 00:05:37.955 15:21:23 -- accel/accel.sh@42 -- # jq -r . 
00:05:37.955 Unsupported workload type: foobar 00:05:37.955 [2024-12-02 15:21:23.665712] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:05:37.955 accel_perf options: 00:05:37.955 [-h help message] 00:05:37.955 [-q queue depth per core] 00:05:37.955 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:05:37.955 [-T number of threads per core 00:05:37.955 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:05:37.955 [-t time in seconds] 00:05:37.955 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:05:37.956 [ dif_verify, , dif_generate, dif_generate_copy 00:05:37.956 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:05:37.956 [-l for compress/decompress workloads, name of uncompressed input file 00:05:37.956 [-S for crc32c workload, use this seed value (default 0) 00:05:37.956 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:05:37.956 [-f for fill workload, use this BYTE value (default 255) 00:05:37.956 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:05:37.956 [-y verify result if this switch is on] 00:05:37.956 [-a tasks to allocate per core (default: same value as -q)] 00:05:37.956 Can be used to spread operations across a wider range of memory. 00:05:37.956 15:21:23 -- common/autotest_common.sh@653 -- # es=1 00:05:37.956 15:21:23 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:37.956 15:21:23 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:37.956 15:21:23 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:37.956 00:05:37.956 real 0m0.029s 00:05:37.956 user 0m0.012s 00:05:37.956 sys 0m0.017s 00:05:37.956 15:21:23 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:37.956 15:21:23 -- common/autotest_common.sh@10 -- # set +x 00:05:37.956 ************************************ 00:05:37.956 END TEST accel_wrong_workload 00:05:37.956 ************************************ 00:05:37.956 Error: writing output failed: Broken pipe 00:05:37.956 15:21:23 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:05:37.956 15:21:23 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:05:37.956 15:21:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:37.956 15:21:23 -- common/autotest_common.sh@10 -- # set +x 00:05:37.956 ************************************ 00:05:37.956 START TEST accel_negative_buffers 00:05:37.956 ************************************ 00:05:37.956 15:21:23 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:05:37.956 15:21:23 -- common/autotest_common.sh@650 -- # local es=0 00:05:37.956 15:21:23 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:05:37.956 15:21:23 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:05:37.956 15:21:23 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:37.956 15:21:23 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:05:37.956 15:21:23 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:37.956 15:21:23 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1 00:05:37.956 15:21:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:05:37.956 15:21:23 -- accel/accel.sh@12 -- # build_accel_config 00:05:37.956 15:21:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:37.956 15:21:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:37.956 15:21:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:37.956 15:21:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:37.956 15:21:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:37.956 15:21:23 -- accel/accel.sh@41 -- # local IFS=, 00:05:37.956 15:21:23 -- accel/accel.sh@42 -- # jq -r . 00:05:37.956 -x option must be non-negative. 00:05:37.956 [2024-12-02 15:21:23.741055] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:05:37.956 accel_perf options: 00:05:37.956 [-h help message] 00:05:37.956 [-q queue depth per core] 00:05:37.956 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:05:37.956 [-T number of threads per core 00:05:37.956 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:05:37.956 [-t time in seconds] 00:05:37.956 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:05:37.956 [ dif_verify, , dif_generate, dif_generate_copy 00:05:37.956 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:05:37.956 [-l for compress/decompress workloads, name of uncompressed input file 00:05:37.956 [-S for crc32c workload, use this seed value (default 0) 00:05:37.956 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:05:37.956 [-f for fill workload, use this BYTE value (default 255) 00:05:37.956 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:05:37.956 [-y verify result if this switch is on] 00:05:37.956 [-a tasks to allocate per core (default: same value as -q)] 00:05:37.956 Can be used to spread operations across a wider range of memory. 
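Both failure cases print the same usage summary, which doubles as a reference for driving the perf tool by hand. A hedged example assembled strictly from the flags listed above (the values are illustrative, not taken from this job's scripts):

  # -q queue depth, -o transfer size in bytes, -t run time in seconds,
  # -w workload type, -S crc32c seed, -y verify results
  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
      -q 32 -o 4096 -t 1 -w crc32c -S 32 -y

Two notes for reading the runs that follow: the *_C2 variants pass "-y -C 2", i.e. an I/O vector of two buffers per operation, and the Bandwidth column is Transfers multiplied by transfer size; for example the first crc32c run below reports 854464 transfers/s * 4096 B, which is about 3337 MiB/s, matching the printed figure.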
00:05:37.956 15:21:23 -- common/autotest_common.sh@653 -- # es=1 00:05:37.956 15:21:23 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:37.956 15:21:23 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:37.956 15:21:23 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:37.956 00:05:37.956 real 0m0.029s 00:05:37.956 user 0m0.014s 00:05:37.956 sys 0m0.016s 00:05:37.956 15:21:23 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:37.956 15:21:23 -- common/autotest_common.sh@10 -- # set +x 00:05:37.956 ************************************ 00:05:37.956 END TEST accel_negative_buffers 00:05:37.956 ************************************ 00:05:37.956 Error: writing output failed: Broken pipe 00:05:38.215 15:21:23 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:05:38.216 15:21:23 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:05:38.216 15:21:23 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:38.216 15:21:23 -- common/autotest_common.sh@10 -- # set +x 00:05:38.216 ************************************ 00:05:38.216 START TEST accel_crc32c 00:05:38.216 ************************************ 00:05:38.216 15:21:23 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -S 32 -y 00:05:38.216 15:21:23 -- accel/accel.sh@16 -- # local accel_opc 00:05:38.216 15:21:23 -- accel/accel.sh@17 -- # local accel_module 00:05:38.216 15:21:23 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:05:38.216 15:21:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:05:38.216 15:21:23 -- accel/accel.sh@12 -- # build_accel_config 00:05:38.216 15:21:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:38.216 15:21:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:38.216 15:21:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:38.216 15:21:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:38.216 15:21:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:38.216 15:21:23 -- accel/accel.sh@41 -- # local IFS=, 00:05:38.216 15:21:23 -- accel/accel.sh@42 -- # jq -r . 00:05:38.216 [2024-12-02 15:21:23.810464] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:38.216 [2024-12-02 15:21:23.810543] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1583873 ] 00:05:38.216 EAL: No free 2048 kB hugepages reported on node 1 00:05:38.216 [2024-12-02 15:21:23.879700] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:38.216 [2024-12-02 15:21:23.948770] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:39.594 15:21:25 -- accel/accel.sh@18 -- # out=' 00:05:39.594 SPDK Configuration: 00:05:39.594 Core mask: 0x1 00:05:39.594 00:05:39.594 Accel Perf Configuration: 00:05:39.594 Workload Type: crc32c 00:05:39.594 CRC-32C seed: 32 00:05:39.594 Transfer size: 4096 bytes 00:05:39.594 Vector count 1 00:05:39.594 Module: software 00:05:39.594 Queue depth: 32 00:05:39.594 Allocate depth: 32 00:05:39.594 # threads/core: 1 00:05:39.594 Run time: 1 seconds 00:05:39.594 Verify: Yes 00:05:39.594 00:05:39.594 Running for 1 seconds... 
00:05:39.594 00:05:39.594 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:39.594 ------------------------------------------------------------------------------------ 00:05:39.594 0,0 854464/s 3337 MiB/s 0 0 00:05:39.594 ==================================================================================== 00:05:39.594 Total 854464/s 3337 MiB/s 0 0' 00:05:39.594 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.594 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.594 15:21:25 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:05:39.595 15:21:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:05:39.595 15:21:25 -- accel/accel.sh@12 -- # build_accel_config 00:05:39.595 15:21:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:39.595 15:21:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:39.595 15:21:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:39.595 15:21:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:39.595 15:21:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:39.595 15:21:25 -- accel/accel.sh@41 -- # local IFS=, 00:05:39.595 15:21:25 -- accel/accel.sh@42 -- # jq -r . 00:05:39.595 [2024-12-02 15:21:25.141215] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:39.595 [2024-12-02 15:21:25.141311] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1584043 ] 00:05:39.595 EAL: No free 2048 kB hugepages reported on node 1 00:05:39.595 [2024-12-02 15:21:25.212129] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:39.595 [2024-12-02 15:21:25.278274] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val= 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val= 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val=0x1 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val= 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val= 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val=crc32c 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val=32 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 
15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val= 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val=software 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@23 -- # accel_module=software 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val=32 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val=32 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val=1 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val=Yes 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val= 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:39.595 15:21:25 -- accel/accel.sh@21 -- # val= 00:05:39.595 15:21:25 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # IFS=: 00:05:39.595 15:21:25 -- accel/accel.sh@20 -- # read -r var val 00:05:40.974 15:21:26 -- accel/accel.sh@21 -- # val= 00:05:40.974 15:21:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:40.974 15:21:26 -- accel/accel.sh@20 -- # IFS=: 00:05:40.974 15:21:26 -- accel/accel.sh@20 -- # read -r var val 00:05:40.974 15:21:26 -- accel/accel.sh@21 -- # val= 00:05:40.974 15:21:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:40.974 15:21:26 -- accel/accel.sh@20 -- # IFS=: 00:05:40.974 15:21:26 -- accel/accel.sh@20 -- # read -r var val 00:05:40.974 15:21:26 -- accel/accel.sh@21 -- # val= 00:05:40.974 15:21:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:40.974 15:21:26 -- accel/accel.sh@20 -- # IFS=: 00:05:40.974 15:21:26 -- accel/accel.sh@20 -- # read -r var val 00:05:40.974 15:21:26 -- accel/accel.sh@21 -- # val= 00:05:40.974 15:21:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:40.974 15:21:26 -- accel/accel.sh@20 -- # IFS=: 00:05:40.974 15:21:26 -- accel/accel.sh@20 -- # read -r var val 00:05:40.974 15:21:26 -- accel/accel.sh@21 -- # val= 00:05:40.974 15:21:26 -- accel/accel.sh@22 -- # case "$var" in 
00:05:40.974 15:21:26 -- accel/accel.sh@20 -- # IFS=: 00:05:40.974 15:21:26 -- accel/accel.sh@20 -- # read -r var val 00:05:40.974 15:21:26 -- accel/accel.sh@21 -- # val= 00:05:40.974 15:21:26 -- accel/accel.sh@22 -- # case "$var" in 00:05:40.974 15:21:26 -- accel/accel.sh@20 -- # IFS=: 00:05:40.974 15:21:26 -- accel/accel.sh@20 -- # read -r var val 00:05:40.974 15:21:26 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:40.974 15:21:26 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:05:40.974 15:21:26 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:40.974 00:05:40.974 real 0m2.663s 00:05:40.974 user 0m2.425s 00:05:40.974 sys 0m0.245s 00:05:40.974 15:21:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:40.974 15:21:26 -- common/autotest_common.sh@10 -- # set +x 00:05:40.974 ************************************ 00:05:40.974 END TEST accel_crc32c 00:05:40.974 ************************************ 00:05:40.974 15:21:26 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:05:40.974 15:21:26 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:05:40.974 15:21:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:40.974 15:21:26 -- common/autotest_common.sh@10 -- # set +x 00:05:40.974 ************************************ 00:05:40.974 START TEST accel_crc32c_C2 00:05:40.974 ************************************ 00:05:40.974 15:21:26 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -y -C 2 00:05:40.974 15:21:26 -- accel/accel.sh@16 -- # local accel_opc 00:05:40.974 15:21:26 -- accel/accel.sh@17 -- # local accel_module 00:05:40.974 15:21:26 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:05:40.974 15:21:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:05:40.974 15:21:26 -- accel/accel.sh@12 -- # build_accel_config 00:05:40.974 15:21:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:40.974 15:21:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:40.974 15:21:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:40.974 15:21:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:40.974 15:21:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:40.974 15:21:26 -- accel/accel.sh@41 -- # local IFS=, 00:05:40.974 15:21:26 -- accel/accel.sh@42 -- # jq -r . 00:05:40.974 [2024-12-02 15:21:26.522809] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:40.974 [2024-12-02 15:21:26.522898] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1584312 ] 00:05:40.974 EAL: No free 2048 kB hugepages reported on node 1 00:05:40.974 [2024-12-02 15:21:26.592208] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:40.974 [2024-12-02 15:21:26.658894] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:42.355 15:21:27 -- accel/accel.sh@18 -- # out=' 00:05:42.355 SPDK Configuration: 00:05:42.355 Core mask: 0x1 00:05:42.355 00:05:42.355 Accel Perf Configuration: 00:05:42.355 Workload Type: crc32c 00:05:42.355 CRC-32C seed: 0 00:05:42.355 Transfer size: 4096 bytes 00:05:42.355 Vector count 2 00:05:42.355 Module: software 00:05:42.355 Queue depth: 32 00:05:42.355 Allocate depth: 32 00:05:42.355 # threads/core: 1 00:05:42.355 Run time: 1 seconds 00:05:42.355 Verify: Yes 00:05:42.355 00:05:42.355 Running for 1 seconds... 00:05:42.355 00:05:42.355 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:42.355 ------------------------------------------------------------------------------------ 00:05:42.355 0,0 618560/s 4832 MiB/s 0 0 00:05:42.355 ==================================================================================== 00:05:42.355 Total 618560/s 2416 MiB/s 0 0' 00:05:42.355 15:21:27 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:27 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:27 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:05:42.355 15:21:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:05:42.355 15:21:27 -- accel/accel.sh@12 -- # build_accel_config 00:05:42.355 15:21:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:42.355 15:21:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:42.355 15:21:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:42.355 15:21:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:42.355 15:21:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:42.355 15:21:27 -- accel/accel.sh@41 -- # local IFS=, 00:05:42.355 15:21:27 -- accel/accel.sh@42 -- # jq -r . 00:05:42.355 [2024-12-02 15:21:27.848906] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:42.355 [2024-12-02 15:21:27.848999] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1584580 ] 00:05:42.355 EAL: No free 2048 kB hugepages reported on node 1 00:05:42.355 [2024-12-02 15:21:27.919469] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:42.355 [2024-12-02 15:21:27.988448] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:42.355 15:21:28 -- accel/accel.sh@21 -- # val= 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:28 -- accel/accel.sh@21 -- # val= 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:28 -- accel/accel.sh@21 -- # val=0x1 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:28 -- accel/accel.sh@21 -- # val= 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:28 -- accel/accel.sh@21 -- # val= 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:28 -- accel/accel.sh@21 -- # val=crc32c 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:28 -- accel/accel.sh@21 -- # val=0 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:28 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:28 -- accel/accel.sh@21 -- # val= 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:28 -- accel/accel.sh@21 -- # val=software 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@23 -- # accel_module=software 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:28 -- accel/accel.sh@21 -- # val=32 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:28 -- accel/accel.sh@21 -- # val=32 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:28 -- 
accel/accel.sh@21 -- # val=1 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:28 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:28 -- accel/accel.sh@21 -- # val=Yes 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:28 -- accel/accel.sh@21 -- # val= 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:42.355 15:21:28 -- accel/accel.sh@21 -- # val= 00:05:42.355 15:21:28 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # IFS=: 00:05:42.355 15:21:28 -- accel/accel.sh@20 -- # read -r var val 00:05:43.732 15:21:29 -- accel/accel.sh@21 -- # val= 00:05:43.732 15:21:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:43.732 15:21:29 -- accel/accel.sh@20 -- # IFS=: 00:05:43.732 15:21:29 -- accel/accel.sh@20 -- # read -r var val 00:05:43.732 15:21:29 -- accel/accel.sh@21 -- # val= 00:05:43.732 15:21:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:43.732 15:21:29 -- accel/accel.sh@20 -- # IFS=: 00:05:43.732 15:21:29 -- accel/accel.sh@20 -- # read -r var val 00:05:43.732 15:21:29 -- accel/accel.sh@21 -- # val= 00:05:43.732 15:21:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:43.732 15:21:29 -- accel/accel.sh@20 -- # IFS=: 00:05:43.732 15:21:29 -- accel/accel.sh@20 -- # read -r var val 00:05:43.732 15:21:29 -- accel/accel.sh@21 -- # val= 00:05:43.732 15:21:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:43.732 15:21:29 -- accel/accel.sh@20 -- # IFS=: 00:05:43.732 15:21:29 -- accel/accel.sh@20 -- # read -r var val 00:05:43.732 15:21:29 -- accel/accel.sh@21 -- # val= 00:05:43.732 15:21:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:43.732 15:21:29 -- accel/accel.sh@20 -- # IFS=: 00:05:43.732 15:21:29 -- accel/accel.sh@20 -- # read -r var val 00:05:43.732 15:21:29 -- accel/accel.sh@21 -- # val= 00:05:43.732 15:21:29 -- accel/accel.sh@22 -- # case "$var" in 00:05:43.732 15:21:29 -- accel/accel.sh@20 -- # IFS=: 00:05:43.732 15:21:29 -- accel/accel.sh@20 -- # read -r var val 00:05:43.732 15:21:29 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:43.732 15:21:29 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:05:43.732 15:21:29 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:43.732 00:05:43.732 real 0m2.662s 00:05:43.732 user 0m2.413s 00:05:43.732 sys 0m0.256s 00:05:43.732 15:21:29 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:43.732 15:21:29 -- common/autotest_common.sh@10 -- # set +x 00:05:43.732 ************************************ 00:05:43.732 END TEST accel_crc32c_C2 00:05:43.732 ************************************ 00:05:43.732 15:21:29 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:05:43.732 15:21:29 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:05:43.732 15:21:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:43.732 15:21:29 -- common/autotest_common.sh@10 -- # set +x 00:05:43.732 ************************************ 00:05:43.732 START TEST accel_copy 
00:05:43.732 ************************************ 00:05:43.732 15:21:29 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy -y 00:05:43.732 15:21:29 -- accel/accel.sh@16 -- # local accel_opc 00:05:43.732 15:21:29 -- accel/accel.sh@17 -- # local accel_module 00:05:43.732 15:21:29 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:05:43.732 15:21:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:05:43.732 15:21:29 -- accel/accel.sh@12 -- # build_accel_config 00:05:43.732 15:21:29 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:43.732 15:21:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:43.732 15:21:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:43.732 15:21:29 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:43.732 15:21:29 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:43.733 15:21:29 -- accel/accel.sh@41 -- # local IFS=, 00:05:43.733 15:21:29 -- accel/accel.sh@42 -- # jq -r . 00:05:43.733 [2024-12-02 15:21:29.233127] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:43.733 [2024-12-02 15:21:29.233216] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1584871 ] 00:05:43.733 EAL: No free 2048 kB hugepages reported on node 1 00:05:43.733 [2024-12-02 15:21:29.302813] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:43.733 [2024-12-02 15:21:29.369860] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:45.111 15:21:30 -- accel/accel.sh@18 -- # out=' 00:05:45.111 SPDK Configuration: 00:05:45.111 Core mask: 0x1 00:05:45.111 00:05:45.111 Accel Perf Configuration: 00:05:45.111 Workload Type: copy 00:05:45.111 Transfer size: 4096 bytes 00:05:45.111 Vector count 1 00:05:45.111 Module: software 00:05:45.111 Queue depth: 32 00:05:45.111 Allocate depth: 32 00:05:45.111 # threads/core: 1 00:05:45.111 Run time: 1 seconds 00:05:45.111 Verify: Yes 00:05:45.111 00:05:45.111 Running for 1 seconds... 00:05:45.111 00:05:45.111 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:45.111 ------------------------------------------------------------------------------------ 00:05:45.111 0,0 543776/s 2124 MiB/s 0 0 00:05:45.111 ==================================================================================== 00:05:45.111 Total 543776/s 2124 MiB/s 0 0' 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:45.111 15:21:30 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:05:45.111 15:21:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:05:45.111 15:21:30 -- accel/accel.sh@12 -- # build_accel_config 00:05:45.111 15:21:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:45.111 15:21:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:45.111 15:21:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:45.111 15:21:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:45.111 15:21:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:45.111 15:21:30 -- accel/accel.sh@41 -- # local IFS=, 00:05:45.111 15:21:30 -- accel/accel.sh@42 -- # jq -r . 00:05:45.111 [2024-12-02 15:21:30.558587] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:45.111 [2024-12-02 15:21:30.558677] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1585139 ] 00:05:45.111 EAL: No free 2048 kB hugepages reported on node 1 00:05:45.111 [2024-12-02 15:21:30.630337] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:45.111 [2024-12-02 15:21:30.697046] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:45.111 15:21:30 -- accel/accel.sh@21 -- # val= 00:05:45.111 15:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:45.111 15:21:30 -- accel/accel.sh@21 -- # val= 00:05:45.111 15:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:45.111 15:21:30 -- accel/accel.sh@21 -- # val=0x1 00:05:45.111 15:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:45.111 15:21:30 -- accel/accel.sh@21 -- # val= 00:05:45.111 15:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:45.111 15:21:30 -- accel/accel.sh@21 -- # val= 00:05:45.111 15:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:45.111 15:21:30 -- accel/accel.sh@21 -- # val=copy 00:05:45.111 15:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.111 15:21:30 -- accel/accel.sh@24 -- # accel_opc=copy 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:45.111 15:21:30 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:45.111 15:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:45.111 15:21:30 -- accel/accel.sh@21 -- # val= 00:05:45.111 15:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:45.111 15:21:30 -- accel/accel.sh@21 -- # val=software 00:05:45.111 15:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.111 15:21:30 -- accel/accel.sh@23 -- # accel_module=software 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:45.111 15:21:30 -- accel/accel.sh@21 -- # val=32 00:05:45.111 15:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.111 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:45.111 15:21:30 -- accel/accel.sh@21 -- # val=32 00:05:45.111 15:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.112 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.112 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:45.112 15:21:30 -- accel/accel.sh@21 -- # val=1 00:05:45.112 15:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.112 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.112 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:45.112 15:21:30 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:05:45.112 15:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.112 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.112 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:45.112 15:21:30 -- accel/accel.sh@21 -- # val=Yes 00:05:45.112 15:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.112 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.112 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:45.112 15:21:30 -- accel/accel.sh@21 -- # val= 00:05:45.112 15:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.112 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.112 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:45.112 15:21:30 -- accel/accel.sh@21 -- # val= 00:05:45.112 15:21:30 -- accel/accel.sh@22 -- # case "$var" in 00:05:45.112 15:21:30 -- accel/accel.sh@20 -- # IFS=: 00:05:45.112 15:21:30 -- accel/accel.sh@20 -- # read -r var val 00:05:46.050 15:21:31 -- accel/accel.sh@21 -- # val= 00:05:46.050 15:21:31 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.050 15:21:31 -- accel/accel.sh@20 -- # IFS=: 00:05:46.050 15:21:31 -- accel/accel.sh@20 -- # read -r var val 00:05:46.050 15:21:31 -- accel/accel.sh@21 -- # val= 00:05:46.050 15:21:31 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.050 15:21:31 -- accel/accel.sh@20 -- # IFS=: 00:05:46.050 15:21:31 -- accel/accel.sh@20 -- # read -r var val 00:05:46.050 15:21:31 -- accel/accel.sh@21 -- # val= 00:05:46.050 15:21:31 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.050 15:21:31 -- accel/accel.sh@20 -- # IFS=: 00:05:46.050 15:21:31 -- accel/accel.sh@20 -- # read -r var val 00:05:46.050 15:21:31 -- accel/accel.sh@21 -- # val= 00:05:46.050 15:21:31 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.050 15:21:31 -- accel/accel.sh@20 -- # IFS=: 00:05:46.050 15:21:31 -- accel/accel.sh@20 -- # read -r var val 00:05:46.050 15:21:31 -- accel/accel.sh@21 -- # val= 00:05:46.050 15:21:31 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.050 15:21:31 -- accel/accel.sh@20 -- # IFS=: 00:05:46.050 15:21:31 -- accel/accel.sh@20 -- # read -r var val 00:05:46.050 15:21:31 -- accel/accel.sh@21 -- # val= 00:05:46.310 15:21:31 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.310 15:21:31 -- accel/accel.sh@20 -- # IFS=: 00:05:46.310 15:21:31 -- accel/accel.sh@20 -- # read -r var val 00:05:46.310 15:21:31 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:46.310 15:21:31 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:05:46.310 15:21:31 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:46.310 00:05:46.310 real 0m2.659s 00:05:46.310 user 0m2.414s 00:05:46.310 sys 0m0.254s 00:05:46.310 15:21:31 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:46.310 15:21:31 -- common/autotest_common.sh@10 -- # set +x 00:05:46.310 ************************************ 00:05:46.310 END TEST accel_copy 00:05:46.310 ************************************ 00:05:46.310 15:21:31 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:46.310 15:21:31 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:05:46.310 15:21:31 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:46.310 15:21:31 -- common/autotest_common.sh@10 -- # set +x 00:05:46.310 ************************************ 00:05:46.310 START TEST accel_fill 00:05:46.310 ************************************ 00:05:46.310 15:21:31 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:46.310 15:21:31 -- accel/accel.sh@16 -- # local accel_opc 
00:05:46.310 15:21:31 -- accel/accel.sh@17 -- # local accel_module 00:05:46.310 15:21:31 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:46.310 15:21:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:46.310 15:21:31 -- accel/accel.sh@12 -- # build_accel_config 00:05:46.310 15:21:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:46.310 15:21:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:46.310 15:21:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:46.310 15:21:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:46.310 15:21:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:46.310 15:21:31 -- accel/accel.sh@41 -- # local IFS=, 00:05:46.310 15:21:31 -- accel/accel.sh@42 -- # jq -r . 00:05:46.310 [2024-12-02 15:21:31.941654] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:46.310 [2024-12-02 15:21:31.941751] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1585426 ] 00:05:46.310 EAL: No free 2048 kB hugepages reported on node 1 00:05:46.310 [2024-12-02 15:21:32.013299] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:46.310 [2024-12-02 15:21:32.079019] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:47.690 15:21:33 -- accel/accel.sh@18 -- # out=' 00:05:47.691 SPDK Configuration: 00:05:47.691 Core mask: 0x1 00:05:47.691 00:05:47.691 Accel Perf Configuration: 00:05:47.691 Workload Type: fill 00:05:47.691 Fill pattern: 0x80 00:05:47.691 Transfer size: 4096 bytes 00:05:47.691 Vector count 1 00:05:47.691 Module: software 00:05:47.691 Queue depth: 64 00:05:47.691 Allocate depth: 64 00:05:47.691 # threads/core: 1 00:05:47.691 Run time: 1 seconds 00:05:47.691 Verify: Yes 00:05:47.691 00:05:47.691 Running for 1 seconds... 00:05:47.691 00:05:47.691 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:47.691 ------------------------------------------------------------------------------------ 00:05:47.691 0,0 959808/s 3749 MiB/s 0 0 00:05:47.691 ==================================================================================== 00:05:47.691 Total 959808/s 3749 MiB/s 0 0' 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:47.691 15:21:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:47.691 15:21:33 -- accel/accel.sh@12 -- # build_accel_config 00:05:47.691 15:21:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:47.691 15:21:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:47.691 15:21:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:47.691 15:21:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:47.691 15:21:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:47.691 15:21:33 -- accel/accel.sh@41 -- # local IFS=, 00:05:47.691 15:21:33 -- accel/accel.sh@42 -- # jq -r . 00:05:47.691 [2024-12-02 15:21:33.268913] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:47.691 [2024-12-02 15:21:33.269005] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1585698 ] 00:05:47.691 EAL: No free 2048 kB hugepages reported on node 1 00:05:47.691 [2024-12-02 15:21:33.337279] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:47.691 [2024-12-02 15:21:33.403103] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:47.691 15:21:33 -- accel/accel.sh@21 -- # val= 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- accel/accel.sh@21 -- # val= 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- accel/accel.sh@21 -- # val=0x1 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- accel/accel.sh@21 -- # val= 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- accel/accel.sh@21 -- # val= 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- accel/accel.sh@21 -- # val=fill 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@24 -- # accel_opc=fill 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- accel/accel.sh@21 -- # val=0x80 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- accel/accel.sh@21 -- # val= 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- accel/accel.sh@21 -- # val=software 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@23 -- # accel_module=software 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- accel/accel.sh@21 -- # val=64 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- accel/accel.sh@21 -- # val=64 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- 
accel/accel.sh@21 -- # val=1 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- accel/accel.sh@21 -- # val=Yes 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- accel/accel.sh@21 -- # val= 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:47.691 15:21:33 -- accel/accel.sh@21 -- # val= 00:05:47.691 15:21:33 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # IFS=: 00:05:47.691 15:21:33 -- accel/accel.sh@20 -- # read -r var val 00:05:49.070 15:21:34 -- accel/accel.sh@21 -- # val= 00:05:49.070 15:21:34 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.070 15:21:34 -- accel/accel.sh@20 -- # IFS=: 00:05:49.070 15:21:34 -- accel/accel.sh@20 -- # read -r var val 00:05:49.070 15:21:34 -- accel/accel.sh@21 -- # val= 00:05:49.070 15:21:34 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.071 15:21:34 -- accel/accel.sh@20 -- # IFS=: 00:05:49.071 15:21:34 -- accel/accel.sh@20 -- # read -r var val 00:05:49.071 15:21:34 -- accel/accel.sh@21 -- # val= 00:05:49.071 15:21:34 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.071 15:21:34 -- accel/accel.sh@20 -- # IFS=: 00:05:49.071 15:21:34 -- accel/accel.sh@20 -- # read -r var val 00:05:49.071 15:21:34 -- accel/accel.sh@21 -- # val= 00:05:49.071 15:21:34 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.071 15:21:34 -- accel/accel.sh@20 -- # IFS=: 00:05:49.071 15:21:34 -- accel/accel.sh@20 -- # read -r var val 00:05:49.071 15:21:34 -- accel/accel.sh@21 -- # val= 00:05:49.071 15:21:34 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.071 15:21:34 -- accel/accel.sh@20 -- # IFS=: 00:05:49.071 15:21:34 -- accel/accel.sh@20 -- # read -r var val 00:05:49.071 15:21:34 -- accel/accel.sh@21 -- # val= 00:05:49.071 15:21:34 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.071 15:21:34 -- accel/accel.sh@20 -- # IFS=: 00:05:49.071 15:21:34 -- accel/accel.sh@20 -- # read -r var val 00:05:49.071 15:21:34 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:49.071 15:21:34 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:05:49.071 15:21:34 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:49.071 00:05:49.071 real 0m2.660s 00:05:49.071 user 0m2.410s 00:05:49.071 sys 0m0.257s 00:05:49.071 15:21:34 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:49.071 15:21:34 -- common/autotest_common.sh@10 -- # set +x 00:05:49.071 ************************************ 00:05:49.071 END TEST accel_fill 00:05:49.071 ************************************ 00:05:49.071 15:21:34 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:05:49.071 15:21:34 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:05:49.071 15:21:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:49.071 15:21:34 -- common/autotest_common.sh@10 -- # set +x 00:05:49.071 ************************************ 00:05:49.071 START TEST 
accel_copy_crc32c 00:05:49.071 ************************************ 00:05:49.071 15:21:34 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y 00:05:49.071 15:21:34 -- accel/accel.sh@16 -- # local accel_opc 00:05:49.071 15:21:34 -- accel/accel.sh@17 -- # local accel_module 00:05:49.071 15:21:34 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:05:49.071 15:21:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:05:49.071 15:21:34 -- accel/accel.sh@12 -- # build_accel_config 00:05:49.071 15:21:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:49.071 15:21:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:49.071 15:21:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:49.071 15:21:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:49.071 15:21:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:49.071 15:21:34 -- accel/accel.sh@41 -- # local IFS=, 00:05:49.071 15:21:34 -- accel/accel.sh@42 -- # jq -r . 00:05:49.071 [2024-12-02 15:21:34.650355] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:49.071 [2024-12-02 15:21:34.650444] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1585916 ] 00:05:49.071 EAL: No free 2048 kB hugepages reported on node 1 00:05:49.071 [2024-12-02 15:21:34.722415] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:49.071 [2024-12-02 15:21:34.791028] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:50.451 15:21:35 -- accel/accel.sh@18 -- # out=' 00:05:50.451 SPDK Configuration: 00:05:50.451 Core mask: 0x1 00:05:50.451 00:05:50.451 Accel Perf Configuration: 00:05:50.451 Workload Type: copy_crc32c 00:05:50.451 CRC-32C seed: 0 00:05:50.451 Vector size: 4096 bytes 00:05:50.451 Transfer size: 4096 bytes 00:05:50.451 Vector count 1 00:05:50.451 Module: software 00:05:50.451 Queue depth: 32 00:05:50.451 Allocate depth: 32 00:05:50.451 # threads/core: 1 00:05:50.451 Run time: 1 seconds 00:05:50.451 Verify: Yes 00:05:50.451 00:05:50.451 Running for 1 seconds... 00:05:50.451 00:05:50.451 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:50.451 ------------------------------------------------------------------------------------ 00:05:50.451 0,0 432864/s 1690 MiB/s 0 0 00:05:50.451 ==================================================================================== 00:05:50.451 Total 432864/s 1690 MiB/s 0 0' 00:05:50.451 15:21:35 -- accel/accel.sh@20 -- # IFS=: 00:05:50.451 15:21:35 -- accel/accel.sh@20 -- # read -r var val 00:05:50.451 15:21:35 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:05:50.451 15:21:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:05:50.451 15:21:35 -- accel/accel.sh@12 -- # build_accel_config 00:05:50.451 15:21:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:50.451 15:21:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:50.451 15:21:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:50.451 15:21:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:50.451 15:21:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:50.451 15:21:35 -- accel/accel.sh@41 -- # local IFS=, 00:05:50.451 15:21:35 -- accel/accel.sh@42 -- # jq -r . 
00:05:50.451 [2024-12-02 15:21:35.981038] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:50.451 [2024-12-02 15:21:35.981136] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1586083 ] 00:05:50.451 EAL: No free 2048 kB hugepages reported on node 1 00:05:50.451 [2024-12-02 15:21:36.050602] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:50.451 [2024-12-02 15:21:36.117645] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:50.451 15:21:36 -- accel/accel.sh@21 -- # val= 00:05:50.451 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.451 15:21:36 -- accel/accel.sh@21 -- # val= 00:05:50.451 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.451 15:21:36 -- accel/accel.sh@21 -- # val=0x1 00:05:50.451 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.451 15:21:36 -- accel/accel.sh@21 -- # val= 00:05:50.451 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.451 15:21:36 -- accel/accel.sh@21 -- # val= 00:05:50.451 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.451 15:21:36 -- accel/accel.sh@21 -- # val=copy_crc32c 00:05:50.451 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.451 15:21:36 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.451 15:21:36 -- accel/accel.sh@21 -- # val=0 00:05:50.451 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.451 15:21:36 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:50.451 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.451 15:21:36 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:50.451 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.451 15:21:36 -- accel/accel.sh@21 -- # val= 00:05:50.451 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.451 15:21:36 -- accel/accel.sh@21 -- # val=software 00:05:50.451 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.451 15:21:36 -- accel/accel.sh@23 -- # accel_module=software 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.451 15:21:36 -- accel/accel.sh@21 -- # val=32 00:05:50.451 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 
00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.451 15:21:36 -- accel/accel.sh@21 -- # val=32 00:05:50.451 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.451 15:21:36 -- accel/accel.sh@21 -- # val=1 00:05:50.451 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.451 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.452 15:21:36 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:50.452 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.452 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.452 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.452 15:21:36 -- accel/accel.sh@21 -- # val=Yes 00:05:50.452 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.452 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.452 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.452 15:21:36 -- accel/accel.sh@21 -- # val= 00:05:50.452 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.452 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.452 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:50.452 15:21:36 -- accel/accel.sh@21 -- # val= 00:05:50.452 15:21:36 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.452 15:21:36 -- accel/accel.sh@20 -- # IFS=: 00:05:50.452 15:21:36 -- accel/accel.sh@20 -- # read -r var val 00:05:51.841 15:21:37 -- accel/accel.sh@21 -- # val= 00:05:51.841 15:21:37 -- accel/accel.sh@22 -- # case "$var" in 00:05:51.841 15:21:37 -- accel/accel.sh@20 -- # IFS=: 00:05:51.841 15:21:37 -- accel/accel.sh@20 -- # read -r var val 00:05:51.841 15:21:37 -- accel/accel.sh@21 -- # val= 00:05:51.841 15:21:37 -- accel/accel.sh@22 -- # case "$var" in 00:05:51.841 15:21:37 -- accel/accel.sh@20 -- # IFS=: 00:05:51.841 15:21:37 -- accel/accel.sh@20 -- # read -r var val 00:05:51.841 15:21:37 -- accel/accel.sh@21 -- # val= 00:05:51.841 15:21:37 -- accel/accel.sh@22 -- # case "$var" in 00:05:51.841 15:21:37 -- accel/accel.sh@20 -- # IFS=: 00:05:51.841 15:21:37 -- accel/accel.sh@20 -- # read -r var val 00:05:51.841 15:21:37 -- accel/accel.sh@21 -- # val= 00:05:51.841 15:21:37 -- accel/accel.sh@22 -- # case "$var" in 00:05:51.841 15:21:37 -- accel/accel.sh@20 -- # IFS=: 00:05:51.841 15:21:37 -- accel/accel.sh@20 -- # read -r var val 00:05:51.841 15:21:37 -- accel/accel.sh@21 -- # val= 00:05:51.841 15:21:37 -- accel/accel.sh@22 -- # case "$var" in 00:05:51.841 15:21:37 -- accel/accel.sh@20 -- # IFS=: 00:05:51.841 15:21:37 -- accel/accel.sh@20 -- # read -r var val 00:05:51.841 15:21:37 -- accel/accel.sh@21 -- # val= 00:05:51.841 15:21:37 -- accel/accel.sh@22 -- # case "$var" in 00:05:51.841 15:21:37 -- accel/accel.sh@20 -- # IFS=: 00:05:51.841 15:21:37 -- accel/accel.sh@20 -- # read -r var val 00:05:51.841 15:21:37 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:51.841 15:21:37 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:05:51.841 15:21:37 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:51.841 00:05:51.841 real 0m2.665s 00:05:51.841 user 0m2.415s 00:05:51.841 sys 0m0.260s 00:05:51.841 15:21:37 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:51.841 15:21:37 -- common/autotest_common.sh@10 -- # set +x 00:05:51.841 ************************************ 00:05:51.841 END TEST accel_copy_crc32c 00:05:51.841 ************************************ 00:05:51.841 
15:21:37 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:05:51.841 15:21:37 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:05:51.841 15:21:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:51.841 15:21:37 -- common/autotest_common.sh@10 -- # set +x 00:05:51.841 ************************************ 00:05:51.841 START TEST accel_copy_crc32c_C2 00:05:51.841 ************************************ 00:05:51.841 15:21:37 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:05:51.841 15:21:37 -- accel/accel.sh@16 -- # local accel_opc 00:05:51.841 15:21:37 -- accel/accel.sh@17 -- # local accel_module 00:05:51.841 15:21:37 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:05:51.841 15:21:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:05:51.841 15:21:37 -- accel/accel.sh@12 -- # build_accel_config 00:05:51.841 15:21:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:51.841 15:21:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:51.841 15:21:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:51.841 15:21:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:51.841 15:21:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:51.841 15:21:37 -- accel/accel.sh@41 -- # local IFS=, 00:05:51.841 15:21:37 -- accel/accel.sh@42 -- # jq -r . 00:05:51.841 [2024-12-02 15:21:37.363173] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:51.841 [2024-12-02 15:21:37.363264] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1586300 ] 00:05:51.841 EAL: No free 2048 kB hugepages reported on node 1 00:05:51.841 [2024-12-02 15:21:37.435795] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:51.841 [2024-12-02 15:21:37.509136] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.222 15:21:38 -- accel/accel.sh@18 -- # out=' 00:05:53.222 SPDK Configuration: 00:05:53.222 Core mask: 0x1 00:05:53.222 00:05:53.222 Accel Perf Configuration: 00:05:53.222 Workload Type: copy_crc32c 00:05:53.222 CRC-32C seed: 0 00:05:53.222 Vector size: 4096 bytes 00:05:53.222 Transfer size: 8192 bytes 00:05:53.222 Vector count 2 00:05:53.222 Module: software 00:05:53.222 Queue depth: 32 00:05:53.222 Allocate depth: 32 00:05:53.222 # threads/core: 1 00:05:53.222 Run time: 1 seconds 00:05:53.222 Verify: Yes 00:05:53.222 00:05:53.222 Running for 1 seconds... 
00:05:53.222 00:05:53.222 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:53.222 ------------------------------------------------------------------------------------ 00:05:53.222 0,0 297280/s 2322 MiB/s 0 0 00:05:53.222 ==================================================================================== 00:05:53.222 Total 297280/s 1161 MiB/s 0 0' 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:05:53.222 15:21:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:05:53.222 15:21:38 -- accel/accel.sh@12 -- # build_accel_config 00:05:53.222 15:21:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:53.222 15:21:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:53.222 15:21:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:53.222 15:21:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:53.222 15:21:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:53.222 15:21:38 -- accel/accel.sh@41 -- # local IFS=, 00:05:53.222 15:21:38 -- accel/accel.sh@42 -- # jq -r . 00:05:53.222 [2024-12-02 15:21:38.700195] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:53.222 [2024-12-02 15:21:38.700285] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1586573 ] 00:05:53.222 EAL: No free 2048 kB hugepages reported on node 1 00:05:53.222 [2024-12-02 15:21:38.770645] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:53.222 [2024-12-02 15:21:38.833590] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val= 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val= 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val=0x1 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val= 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val= 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val=copy_crc32c 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val=0 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # 
IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val='8192 bytes' 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val= 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val=software 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@23 -- # accel_module=software 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val=32 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val=32 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val=1 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val=Yes 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val= 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:53.222 15:21:38 -- accel/accel.sh@21 -- # val= 00:05:53.222 15:21:38 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # IFS=: 00:05:53.222 15:21:38 -- accel/accel.sh@20 -- # read -r var val 00:05:54.602 15:21:40 -- accel/accel.sh@21 -- # val= 00:05:54.602 15:21:40 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.602 15:21:40 -- accel/accel.sh@20 -- # IFS=: 00:05:54.602 15:21:40 -- accel/accel.sh@20 -- # read -r var val 00:05:54.602 15:21:40 -- accel/accel.sh@21 -- # val= 00:05:54.602 15:21:40 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.602 15:21:40 -- accel/accel.sh@20 -- # IFS=: 00:05:54.602 15:21:40 -- accel/accel.sh@20 -- # read -r var val 00:05:54.602 15:21:40 -- accel/accel.sh@21 -- # val= 00:05:54.602 15:21:40 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.602 15:21:40 -- accel/accel.sh@20 -- # IFS=: 00:05:54.602 15:21:40 -- accel/accel.sh@20 -- # read -r var val 00:05:54.602 15:21:40 -- accel/accel.sh@21 -- # val= 00:05:54.602 15:21:40 -- 
accel/accel.sh@22 -- # case "$var" in 00:05:54.602 15:21:40 -- accel/accel.sh@20 -- # IFS=: 00:05:54.602 15:21:40 -- accel/accel.sh@20 -- # read -r var val 00:05:54.602 15:21:40 -- accel/accel.sh@21 -- # val= 00:05:54.602 15:21:40 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.602 15:21:40 -- accel/accel.sh@20 -- # IFS=: 00:05:54.602 15:21:40 -- accel/accel.sh@20 -- # read -r var val 00:05:54.602 15:21:40 -- accel/accel.sh@21 -- # val= 00:05:54.602 15:21:40 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.602 15:21:40 -- accel/accel.sh@20 -- # IFS=: 00:05:54.602 15:21:40 -- accel/accel.sh@20 -- # read -r var val 00:05:54.602 15:21:40 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:54.602 15:21:40 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:05:54.602 15:21:40 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:54.602 00:05:54.602 real 0m2.673s 00:05:54.602 user 0m2.409s 00:05:54.602 sys 0m0.272s 00:05:54.602 15:21:40 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:54.602 15:21:40 -- common/autotest_common.sh@10 -- # set +x 00:05:54.602 ************************************ 00:05:54.602 END TEST accel_copy_crc32c_C2 00:05:54.602 ************************************ 00:05:54.602 15:21:40 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:05:54.602 15:21:40 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:05:54.602 15:21:40 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:54.602 15:21:40 -- common/autotest_common.sh@10 -- # set +x 00:05:54.602 ************************************ 00:05:54.602 START TEST accel_dualcast 00:05:54.602 ************************************ 00:05:54.602 15:21:40 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dualcast -y 00:05:54.602 15:21:40 -- accel/accel.sh@16 -- # local accel_opc 00:05:54.602 15:21:40 -- accel/accel.sh@17 -- # local accel_module 00:05:54.602 15:21:40 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:05:54.602 15:21:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:05:54.602 15:21:40 -- accel/accel.sh@12 -- # build_accel_config 00:05:54.602 15:21:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:54.602 15:21:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:54.602 15:21:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:54.602 15:21:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:54.602 15:21:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:54.602 15:21:40 -- accel/accel.sh@41 -- # local IFS=, 00:05:54.602 15:21:40 -- accel/accel.sh@42 -- # jq -r . 00:05:54.602 [2024-12-02 15:21:40.077375] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
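
A quick consistency check of the copy_crc32c -C 2 table further up: with a two-element vector, each operation moves 8192 bytes, which is exactly what the per-core row reports; the Total row appears to count a single 4096-byte vector per operation instead, hence the halved figure. Both can be reproduced in shell arithmetic:

    echo $(( 297280 * 8192 / 1024 / 1024 ))   # => 2322 MiB/s, the 0,0 row
    echo $(( 297280 * 4096 / 1024 / 1024 ))   # => 1161 MiB/s, the Total row
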
00:05:54.602 [2024-12-02 15:21:40.077467] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1586855 ] 00:05:54.602 EAL: No free 2048 kB hugepages reported on node 1 00:05:54.602 [2024-12-02 15:21:40.146790] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:54.602 [2024-12-02 15:21:40.217175] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.981 15:21:41 -- accel/accel.sh@18 -- # out=' 00:05:55.981 SPDK Configuration: 00:05:55.981 Core mask: 0x1 00:05:55.981 00:05:55.981 Accel Perf Configuration: 00:05:55.981 Workload Type: dualcast 00:05:55.981 Transfer size: 4096 bytes 00:05:55.981 Vector count 1 00:05:55.981 Module: software 00:05:55.981 Queue depth: 32 00:05:55.981 Allocate depth: 32 00:05:55.981 # threads/core: 1 00:05:55.981 Run time: 1 seconds 00:05:55.981 Verify: Yes 00:05:55.981 00:05:55.981 Running for 1 seconds... 00:05:55.981 00:05:55.981 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:55.981 ------------------------------------------------------------------------------------ 00:05:55.981 0,0 606080/s 2367 MiB/s 0 0 00:05:55.981 ==================================================================================== 00:05:55.981 Total 606080/s 2367 MiB/s 0 0' 00:05:55.981 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.981 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:55.981 15:21:41 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:05:55.981 15:21:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:05:55.981 15:21:41 -- accel/accel.sh@12 -- # build_accel_config 00:05:55.981 15:21:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:55.981 15:21:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:55.981 15:21:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:55.981 15:21:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:55.981 15:21:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:55.981 15:21:41 -- accel/accel.sh@41 -- # local IFS=, 00:05:55.981 15:21:41 -- accel/accel.sh@42 -- # jq -r . 00:05:55.981 [2024-12-02 15:21:41.406061] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
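
dualcast, whose first run is summarized just above, writes one 4096-byte source buffer to two destinations per operation. The bandwidth column follows directly from the 4096-byte transfer size:

    echo $(( 606080 * 4096 / 1024 / 1024 ))   # => 2367 MiB/s, as reported
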
00:05:55.981 [2024-12-02 15:21:41.406155] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1587129 ] 00:05:55.981 EAL: No free 2048 kB hugepages reported on node 1 00:05:55.981 [2024-12-02 15:21:41.473793] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:55.981 [2024-12-02 15:21:41.540983] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.981 15:21:41 -- accel/accel.sh@21 -- # val= 00:05:55.981 15:21:41 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.981 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.981 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:55.981 15:21:41 -- accel/accel.sh@21 -- # val= 00:05:55.981 15:21:41 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.981 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.981 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:55.981 15:21:41 -- accel/accel.sh@21 -- # val=0x1 00:05:55.981 15:21:41 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:55.982 15:21:41 -- accel/accel.sh@21 -- # val= 00:05:55.982 15:21:41 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:55.982 15:21:41 -- accel/accel.sh@21 -- # val= 00:05:55.982 15:21:41 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:55.982 15:21:41 -- accel/accel.sh@21 -- # val=dualcast 00:05:55.982 15:21:41 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.982 15:21:41 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:55.982 15:21:41 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:55.982 15:21:41 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:55.982 15:21:41 -- accel/accel.sh@21 -- # val= 00:05:55.982 15:21:41 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:55.982 15:21:41 -- accel/accel.sh@21 -- # val=software 00:05:55.982 15:21:41 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.982 15:21:41 -- accel/accel.sh@23 -- # accel_module=software 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:55.982 15:21:41 -- accel/accel.sh@21 -- # val=32 00:05:55.982 15:21:41 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:55.982 15:21:41 -- accel/accel.sh@21 -- # val=32 00:05:55.982 15:21:41 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:55.982 15:21:41 -- accel/accel.sh@21 -- # val=1 00:05:55.982 15:21:41 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:55.982 15:21:41 
-- accel/accel.sh@21 -- # val='1 seconds' 00:05:55.982 15:21:41 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:55.982 15:21:41 -- accel/accel.sh@21 -- # val=Yes 00:05:55.982 15:21:41 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:55.982 15:21:41 -- accel/accel.sh@21 -- # val= 00:05:55.982 15:21:41 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:55.982 15:21:41 -- accel/accel.sh@21 -- # val= 00:05:55.982 15:21:41 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # IFS=: 00:05:55.982 15:21:41 -- accel/accel.sh@20 -- # read -r var val 00:05:56.919 15:21:42 -- accel/accel.sh@21 -- # val= 00:05:56.919 15:21:42 -- accel/accel.sh@22 -- # case "$var" in 00:05:56.919 15:21:42 -- accel/accel.sh@20 -- # IFS=: 00:05:56.919 15:21:42 -- accel/accel.sh@20 -- # read -r var val 00:05:56.919 15:21:42 -- accel/accel.sh@21 -- # val= 00:05:56.919 15:21:42 -- accel/accel.sh@22 -- # case "$var" in 00:05:56.919 15:21:42 -- accel/accel.sh@20 -- # IFS=: 00:05:56.919 15:21:42 -- accel/accel.sh@20 -- # read -r var val 00:05:56.919 15:21:42 -- accel/accel.sh@21 -- # val= 00:05:56.919 15:21:42 -- accel/accel.sh@22 -- # case "$var" in 00:05:56.919 15:21:42 -- accel/accel.sh@20 -- # IFS=: 00:05:56.919 15:21:42 -- accel/accel.sh@20 -- # read -r var val 00:05:56.919 15:21:42 -- accel/accel.sh@21 -- # val= 00:05:56.919 15:21:42 -- accel/accel.sh@22 -- # case "$var" in 00:05:56.919 15:21:42 -- accel/accel.sh@20 -- # IFS=: 00:05:56.919 15:21:42 -- accel/accel.sh@20 -- # read -r var val 00:05:56.919 15:21:42 -- accel/accel.sh@21 -- # val= 00:05:56.919 15:21:42 -- accel/accel.sh@22 -- # case "$var" in 00:05:56.919 15:21:42 -- accel/accel.sh@20 -- # IFS=: 00:05:56.919 15:21:42 -- accel/accel.sh@20 -- # read -r var val 00:05:56.920 15:21:42 -- accel/accel.sh@21 -- # val= 00:05:56.920 15:21:42 -- accel/accel.sh@22 -- # case "$var" in 00:05:56.920 15:21:42 -- accel/accel.sh@20 -- # IFS=: 00:05:56.920 15:21:42 -- accel/accel.sh@20 -- # read -r var val 00:05:56.920 15:21:42 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:56.920 15:21:42 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:05:56.920 15:21:42 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:56.920 00:05:56.920 real 0m2.659s 00:05:56.920 user 0m2.408s 00:05:56.920 sys 0m0.258s 00:05:56.920 15:21:42 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:56.920 15:21:42 -- common/autotest_common.sh@10 -- # set +x 00:05:56.920 ************************************ 00:05:56.920 END TEST accel_dualcast 00:05:56.920 ************************************ 00:05:57.179 15:21:42 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:05:57.179 15:21:42 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:05:57.179 15:21:42 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:57.179 15:21:42 -- common/autotest_common.sh@10 -- # set +x 00:05:57.179 ************************************ 00:05:57.179 START TEST accel_compare 00:05:57.179 ************************************ 00:05:57.179 15:21:42 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compare -y 00:05:57.179 15:21:42 -- accel/accel.sh@16 -- # local accel_opc 00:05:57.179 15:21:42 
-- accel/accel.sh@17 -- # local accel_module 00:05:57.179 15:21:42 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:05:57.179 15:21:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:05:57.179 15:21:42 -- accel/accel.sh@12 -- # build_accel_config 00:05:57.179 15:21:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:57.179 15:21:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:57.179 15:21:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:57.179 15:21:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:57.179 15:21:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:57.179 15:21:42 -- accel/accel.sh@41 -- # local IFS=, 00:05:57.179 15:21:42 -- accel/accel.sh@42 -- # jq -r . 00:05:57.179 [2024-12-02 15:21:42.786244] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:57.179 [2024-12-02 15:21:42.786329] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1587410 ] 00:05:57.179 EAL: No free 2048 kB hugepages reported on node 1 00:05:57.179 [2024-12-02 15:21:42.857080] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:57.179 [2024-12-02 15:21:42.924628] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:58.559 15:21:44 -- accel/accel.sh@18 -- # out=' 00:05:58.559 SPDK Configuration: 00:05:58.559 Core mask: 0x1 00:05:58.559 00:05:58.559 Accel Perf Configuration: 00:05:58.559 Workload Type: compare 00:05:58.559 Transfer size: 4096 bytes 00:05:58.559 Vector count 1 00:05:58.559 Module: software 00:05:58.559 Queue depth: 32 00:05:58.559 Allocate depth: 32 00:05:58.559 # threads/core: 1 00:05:58.559 Run time: 1 seconds 00:05:58.559 Verify: Yes 00:05:58.559 00:05:58.559 Running for 1 seconds... 00:05:58.559 00:05:58.559 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:58.559 ------------------------------------------------------------------------------------ 00:05:58.559 0,0 792704/s 3096 MiB/s 0 0 00:05:58.559 ==================================================================================== 00:05:58.559 Total 792704/s 3096 MiB/s 0 0' 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:58.559 15:21:44 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:05:58.559 15:21:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:05:58.559 15:21:44 -- accel/accel.sh@12 -- # build_accel_config 00:05:58.559 15:21:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:58.559 15:21:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:58.559 15:21:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:58.559 15:21:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:58.559 15:21:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:58.559 15:21:44 -- accel/accel.sh@41 -- # local IFS=, 00:05:58.559 15:21:44 -- accel/accel.sh@42 -- # jq -r . 00:05:58.559 [2024-12-02 15:21:44.114669] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
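
compare performs a memcmp-style check of two buffers without writing anything back, which is why it posts the highest operation rate of the group so far. The table above again follows from the 4096-byte transfer size:

    echo $(( 792704 * 4096 / 1024 / 1024 ))   # => 3096 MiB/s, as reported
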
00:05:58.559 [2024-12-02 15:21:44.114768] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1587678 ] 00:05:58.559 EAL: No free 2048 kB hugepages reported on node 1 00:05:58.559 [2024-12-02 15:21:44.184514] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:58.559 [2024-12-02 15:21:44.251200] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:58.559 15:21:44 -- accel/accel.sh@21 -- # val= 00:05:58.559 15:21:44 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:58.559 15:21:44 -- accel/accel.sh@21 -- # val= 00:05:58.559 15:21:44 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:58.559 15:21:44 -- accel/accel.sh@21 -- # val=0x1 00:05:58.559 15:21:44 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:58.559 15:21:44 -- accel/accel.sh@21 -- # val= 00:05:58.559 15:21:44 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:58.559 15:21:44 -- accel/accel.sh@21 -- # val= 00:05:58.559 15:21:44 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:58.559 15:21:44 -- accel/accel.sh@21 -- # val=compare 00:05:58.559 15:21:44 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.559 15:21:44 -- accel/accel.sh@24 -- # accel_opc=compare 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:58.559 15:21:44 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:58.559 15:21:44 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:58.559 15:21:44 -- accel/accel.sh@21 -- # val= 00:05:58.559 15:21:44 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:58.559 15:21:44 -- accel/accel.sh@21 -- # val=software 00:05:58.559 15:21:44 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.559 15:21:44 -- accel/accel.sh@23 -- # accel_module=software 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:58.559 15:21:44 -- accel/accel.sh@21 -- # val=32 00:05:58.559 15:21:44 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.559 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:58.560 15:21:44 -- accel/accel.sh@21 -- # val=32 00:05:58.560 15:21:44 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.560 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.560 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:58.560 15:21:44 -- accel/accel.sh@21 -- # val=1 00:05:58.560 15:21:44 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.560 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.560 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:58.560 15:21:44 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:05:58.560 15:21:44 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.560 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.560 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:58.560 15:21:44 -- accel/accel.sh@21 -- # val=Yes 00:05:58.560 15:21:44 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.560 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.560 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:58.560 15:21:44 -- accel/accel.sh@21 -- # val= 00:05:58.560 15:21:44 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.560 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.560 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:58.560 15:21:44 -- accel/accel.sh@21 -- # val= 00:05:58.560 15:21:44 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.560 15:21:44 -- accel/accel.sh@20 -- # IFS=: 00:05:58.560 15:21:44 -- accel/accel.sh@20 -- # read -r var val 00:05:59.937 15:21:45 -- accel/accel.sh@21 -- # val= 00:05:59.937 15:21:45 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.937 15:21:45 -- accel/accel.sh@20 -- # IFS=: 00:05:59.937 15:21:45 -- accel/accel.sh@20 -- # read -r var val 00:05:59.937 15:21:45 -- accel/accel.sh@21 -- # val= 00:05:59.937 15:21:45 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.937 15:21:45 -- accel/accel.sh@20 -- # IFS=: 00:05:59.937 15:21:45 -- accel/accel.sh@20 -- # read -r var val 00:05:59.937 15:21:45 -- accel/accel.sh@21 -- # val= 00:05:59.937 15:21:45 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.937 15:21:45 -- accel/accel.sh@20 -- # IFS=: 00:05:59.937 15:21:45 -- accel/accel.sh@20 -- # read -r var val 00:05:59.937 15:21:45 -- accel/accel.sh@21 -- # val= 00:05:59.937 15:21:45 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.937 15:21:45 -- accel/accel.sh@20 -- # IFS=: 00:05:59.937 15:21:45 -- accel/accel.sh@20 -- # read -r var val 00:05:59.937 15:21:45 -- accel/accel.sh@21 -- # val= 00:05:59.937 15:21:45 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.937 15:21:45 -- accel/accel.sh@20 -- # IFS=: 00:05:59.937 15:21:45 -- accel/accel.sh@20 -- # read -r var val 00:05:59.937 15:21:45 -- accel/accel.sh@21 -- # val= 00:05:59.937 15:21:45 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.937 15:21:45 -- accel/accel.sh@20 -- # IFS=: 00:05:59.937 15:21:45 -- accel/accel.sh@20 -- # read -r var val 00:05:59.937 15:21:45 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:59.937 15:21:45 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:05:59.937 15:21:45 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:59.937 00:05:59.937 real 0m2.663s 00:05:59.937 user 0m2.409s 00:05:59.937 sys 0m0.259s 00:05:59.937 15:21:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:59.937 15:21:45 -- common/autotest_common.sh@10 -- # set +x 00:05:59.937 ************************************ 00:05:59.937 END TEST accel_compare 00:05:59.937 ************************************ 00:05:59.937 15:21:45 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:05:59.937 15:21:45 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:05:59.937 15:21:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:59.937 15:21:45 -- common/autotest_common.sh@10 -- # set +x 00:05:59.937 ************************************ 00:05:59.937 START TEST accel_xor 00:05:59.937 ************************************ 00:05:59.937 15:21:45 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y 00:05:59.937 15:21:45 -- accel/accel.sh@16 -- # local accel_opc 00:05:59.937 15:21:45 -- accel/accel.sh@17 
-- # local accel_module 00:05:59.937 15:21:45 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:05:59.937 15:21:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:05:59.938 15:21:45 -- accel/accel.sh@12 -- # build_accel_config 00:05:59.938 15:21:45 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:59.938 15:21:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:59.938 15:21:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:59.938 15:21:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:59.938 15:21:45 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:59.938 15:21:45 -- accel/accel.sh@41 -- # local IFS=, 00:05:59.938 15:21:45 -- accel/accel.sh@42 -- # jq -r . 00:05:59.938 [2024-12-02 15:21:45.497243] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:59.938 [2024-12-02 15:21:45.497324] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1587972 ] 00:05:59.938 EAL: No free 2048 kB hugepages reported on node 1 00:05:59.938 [2024-12-02 15:21:45.567222] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:59.938 [2024-12-02 15:21:45.640382] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.407 15:21:46 -- accel/accel.sh@18 -- # out=' 00:06:01.407 SPDK Configuration: 00:06:01.407 Core mask: 0x1 00:06:01.407 00:06:01.407 Accel Perf Configuration: 00:06:01.407 Workload Type: xor 00:06:01.407 Source buffers: 2 00:06:01.407 Transfer size: 4096 bytes 00:06:01.407 Vector count 1 00:06:01.407 Module: software 00:06:01.407 Queue depth: 32 00:06:01.407 Allocate depth: 32 00:06:01.407 # threads/core: 1 00:06:01.407 Run time: 1 seconds 00:06:01.407 Verify: Yes 00:06:01.407 00:06:01.407 Running for 1 seconds... 00:06:01.407 00:06:01.407 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:01.407 ------------------------------------------------------------------------------------ 00:06:01.407 0,0 712224/s 2782 MiB/s 0 0 00:06:01.407 ==================================================================================== 00:06:01.407 Total 712224/s 2782 MiB/s 0 0' 00:06:01.407 15:21:46 -- accel/accel.sh@20 -- # IFS=: 00:06:01.407 15:21:46 -- accel/accel.sh@20 -- # read -r var val 00:06:01.407 15:21:46 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:01.407 15:21:46 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:01.407 15:21:46 -- accel/accel.sh@12 -- # build_accel_config 00:06:01.407 15:21:46 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:01.407 15:21:46 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:01.407 15:21:46 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:01.407 15:21:46 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:01.407 15:21:46 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:01.407 15:21:46 -- accel/accel.sh@41 -- # local IFS=, 00:06:01.407 15:21:46 -- accel/accel.sh@42 -- # jq -r . 00:06:01.407 [2024-12-02 15:21:46.833257] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
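
The xor run above uses two source buffers ('Source buffers: 2' in its configuration block); accel_perf's -x option sets that count, and the next test repeats the workload with -x 3. The reported bandwidth once more matches the 4096-byte transfer size:

    echo $(( 712224 * 4096 / 1024 / 1024 ))   # => 2782 MiB/s, as reported
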
00:06:01.407 [2024-12-02 15:21:46.833343] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1588143 ] 00:06:01.407 EAL: No free 2048 kB hugepages reported on node 1 00:06:01.407 [2024-12-02 15:21:46.904512] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:01.407 [2024-12-02 15:21:46.977579] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.407 15:21:47 -- accel/accel.sh@21 -- # val= 00:06:01.407 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.407 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.407 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:01.407 15:21:47 -- accel/accel.sh@21 -- # val= 00:06:01.407 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.407 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.407 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:01.407 15:21:47 -- accel/accel.sh@21 -- # val=0x1 00:06:01.407 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.407 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.407 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:01.407 15:21:47 -- accel/accel.sh@21 -- # val= 00:06:01.407 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.407 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.407 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:01.407 15:21:47 -- accel/accel.sh@21 -- # val= 00:06:01.407 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.407 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.407 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:01.407 15:21:47 -- accel/accel.sh@21 -- # val=xor 00:06:01.407 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.407 15:21:47 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:01.407 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.407 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:01.407 15:21:47 -- accel/accel.sh@21 -- # val=2 00:06:01.407 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.407 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:01.408 15:21:47 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:01.408 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:01.408 15:21:47 -- accel/accel.sh@21 -- # val= 00:06:01.408 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:01.408 15:21:47 -- accel/accel.sh@21 -- # val=software 00:06:01.408 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.408 15:21:47 -- accel/accel.sh@23 -- # accel_module=software 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:01.408 15:21:47 -- accel/accel.sh@21 -- # val=32 00:06:01.408 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:01.408 15:21:47 -- accel/accel.sh@21 -- # val=32 00:06:01.408 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:01.408 15:21:47 -- 
accel/accel.sh@21 -- # val=1 00:06:01.408 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:01.408 15:21:47 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:01.408 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:01.408 15:21:47 -- accel/accel.sh@21 -- # val=Yes 00:06:01.408 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:01.408 15:21:47 -- accel/accel.sh@21 -- # val= 00:06:01.408 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:01.408 15:21:47 -- accel/accel.sh@21 -- # val= 00:06:01.408 15:21:47 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # IFS=: 00:06:01.408 15:21:47 -- accel/accel.sh@20 -- # read -r var val 00:06:02.343 15:21:48 -- accel/accel.sh@21 -- # val= 00:06:02.343 15:21:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.343 15:21:48 -- accel/accel.sh@20 -- # IFS=: 00:06:02.343 15:21:48 -- accel/accel.sh@20 -- # read -r var val 00:06:02.343 15:21:48 -- accel/accel.sh@21 -- # val= 00:06:02.343 15:21:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.343 15:21:48 -- accel/accel.sh@20 -- # IFS=: 00:06:02.343 15:21:48 -- accel/accel.sh@20 -- # read -r var val 00:06:02.343 15:21:48 -- accel/accel.sh@21 -- # val= 00:06:02.343 15:21:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.343 15:21:48 -- accel/accel.sh@20 -- # IFS=: 00:06:02.343 15:21:48 -- accel/accel.sh@20 -- # read -r var val 00:06:02.343 15:21:48 -- accel/accel.sh@21 -- # val= 00:06:02.343 15:21:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.343 15:21:48 -- accel/accel.sh@20 -- # IFS=: 00:06:02.343 15:21:48 -- accel/accel.sh@20 -- # read -r var val 00:06:02.343 15:21:48 -- accel/accel.sh@21 -- # val= 00:06:02.343 15:21:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.343 15:21:48 -- accel/accel.sh@20 -- # IFS=: 00:06:02.343 15:21:48 -- accel/accel.sh@20 -- # read -r var val 00:06:02.343 15:21:48 -- accel/accel.sh@21 -- # val= 00:06:02.343 15:21:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.343 15:21:48 -- accel/accel.sh@20 -- # IFS=: 00:06:02.343 15:21:48 -- accel/accel.sh@20 -- # read -r var val 00:06:02.343 15:21:48 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:02.343 15:21:48 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:02.343 15:21:48 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:02.343 00:06:02.343 real 0m2.677s 00:06:02.343 user 0m2.421s 00:06:02.343 sys 0m0.266s 00:06:02.343 15:21:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:02.343 15:21:48 -- common/autotest_common.sh@10 -- # set +x 00:06:02.343 ************************************ 00:06:02.343 END TEST accel_xor 00:06:02.343 ************************************ 00:06:02.602 15:21:48 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:02.602 15:21:48 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:02.602 15:21:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:02.602 15:21:48 -- common/autotest_common.sh@10 -- # set +x 00:06:02.602 ************************************ 00:06:02.602 START TEST accel_xor 
00:06:02.602 ************************************ 00:06:02.602 15:21:48 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y -x 3 00:06:02.602 15:21:48 -- accel/accel.sh@16 -- # local accel_opc 00:06:02.602 15:21:48 -- accel/accel.sh@17 -- # local accel_module 00:06:02.602 15:21:48 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:06:02.602 15:21:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:02.602 15:21:48 -- accel/accel.sh@12 -- # build_accel_config 00:06:02.602 15:21:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:02.602 15:21:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:02.602 15:21:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:02.602 15:21:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:02.602 15:21:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:02.602 15:21:48 -- accel/accel.sh@41 -- # local IFS=, 00:06:02.602 15:21:48 -- accel/accel.sh@42 -- # jq -r . 00:06:02.602 [2024-12-02 15:21:48.222741] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:02.602 [2024-12-02 15:21:48.222831] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1588360 ] 00:06:02.602 EAL: No free 2048 kB hugepages reported on node 1 00:06:02.602 [2024-12-02 15:21:48.291657] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:02.602 [2024-12-02 15:21:48.358993] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.978 15:21:49 -- accel/accel.sh@18 -- # out=' 00:06:03.978 SPDK Configuration: 00:06:03.978 Core mask: 0x1 00:06:03.978 00:06:03.978 Accel Perf Configuration: 00:06:03.978 Workload Type: xor 00:06:03.978 Source buffers: 3 00:06:03.978 Transfer size: 4096 bytes 00:06:03.978 Vector count 1 00:06:03.978 Module: software 00:06:03.978 Queue depth: 32 00:06:03.978 Allocate depth: 32 00:06:03.978 # threads/core: 1 00:06:03.978 Run time: 1 seconds 00:06:03.978 Verify: Yes 00:06:03.978 00:06:03.978 Running for 1 seconds... 00:06:03.978 00:06:03.978 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:03.978 ------------------------------------------------------------------------------------ 00:06:03.978 0,0 675520/s 2638 MiB/s 0 0 00:06:03.978 ==================================================================================== 00:06:03.978 Total 675520/s 2638 MiB/s 0 0' 00:06:03.978 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.978 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.978 15:21:49 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:03.978 15:21:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:03.978 15:21:49 -- accel/accel.sh@12 -- # build_accel_config 00:06:03.978 15:21:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:03.978 15:21:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:03.978 15:21:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:03.978 15:21:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:03.978 15:21:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:03.978 15:21:49 -- accel/accel.sh@41 -- # local IFS=, 00:06:03.978 15:21:49 -- accel/accel.sh@42 -- # jq -r . 00:06:03.978 [2024-12-02 15:21:49.547008] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
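
Adding a third xor source buffer costs roughly five percent in operation rate relative to the two-source run, plausibly because each operation now reads one extra 4096-byte input:

    echo $(( 712224 * 4096 / 1024 / 1024 ))   # => 2782 MiB/s with -x 2
    echo $(( 675520 * 4096 / 1024 / 1024 ))   # => 2638 MiB/s with -x 3
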
00:06:03.978 [2024-12-02 15:21:49.547099] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1588556 ] 00:06:03.978 EAL: No free 2048 kB hugepages reported on node 1 00:06:03.978 [2024-12-02 15:21:49.617362] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:03.978 [2024-12-02 15:21:49.684223] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.978 15:21:49 -- accel/accel.sh@21 -- # val= 00:06:03.978 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.978 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.978 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.978 15:21:49 -- accel/accel.sh@21 -- # val= 00:06:03.978 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.978 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.978 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.978 15:21:49 -- accel/accel.sh@21 -- # val=0x1 00:06:03.978 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.978 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.979 15:21:49 -- accel/accel.sh@21 -- # val= 00:06:03.979 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.979 15:21:49 -- accel/accel.sh@21 -- # val= 00:06:03.979 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.979 15:21:49 -- accel/accel.sh@21 -- # val=xor 00:06:03.979 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.979 15:21:49 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.979 15:21:49 -- accel/accel.sh@21 -- # val=3 00:06:03.979 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.979 15:21:49 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:03.979 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.979 15:21:49 -- accel/accel.sh@21 -- # val= 00:06:03.979 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.979 15:21:49 -- accel/accel.sh@21 -- # val=software 00:06:03.979 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.979 15:21:49 -- accel/accel.sh@23 -- # accel_module=software 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.979 15:21:49 -- accel/accel.sh@21 -- # val=32 00:06:03.979 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.979 15:21:49 -- accel/accel.sh@21 -- # val=32 00:06:03.979 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.979 15:21:49 -- 
accel/accel.sh@21 -- # val=1 00:06:03.979 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.979 15:21:49 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:03.979 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.979 15:21:49 -- accel/accel.sh@21 -- # val=Yes 00:06:03.979 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.979 15:21:49 -- accel/accel.sh@21 -- # val= 00:06:03.979 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:03.979 15:21:49 -- accel/accel.sh@21 -- # val= 00:06:03.979 15:21:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # IFS=: 00:06:03.979 15:21:49 -- accel/accel.sh@20 -- # read -r var val 00:06:05.356 15:21:50 -- accel/accel.sh@21 -- # val= 00:06:05.357 15:21:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.357 15:21:50 -- accel/accel.sh@20 -- # IFS=: 00:06:05.357 15:21:50 -- accel/accel.sh@20 -- # read -r var val 00:06:05.357 15:21:50 -- accel/accel.sh@21 -- # val= 00:06:05.357 15:21:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.357 15:21:50 -- accel/accel.sh@20 -- # IFS=: 00:06:05.357 15:21:50 -- accel/accel.sh@20 -- # read -r var val 00:06:05.357 15:21:50 -- accel/accel.sh@21 -- # val= 00:06:05.357 15:21:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.357 15:21:50 -- accel/accel.sh@20 -- # IFS=: 00:06:05.357 15:21:50 -- accel/accel.sh@20 -- # read -r var val 00:06:05.357 15:21:50 -- accel/accel.sh@21 -- # val= 00:06:05.357 15:21:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.357 15:21:50 -- accel/accel.sh@20 -- # IFS=: 00:06:05.357 15:21:50 -- accel/accel.sh@20 -- # read -r var val 00:06:05.357 15:21:50 -- accel/accel.sh@21 -- # val= 00:06:05.357 15:21:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.357 15:21:50 -- accel/accel.sh@20 -- # IFS=: 00:06:05.357 15:21:50 -- accel/accel.sh@20 -- # read -r var val 00:06:05.357 15:21:50 -- accel/accel.sh@21 -- # val= 00:06:05.357 15:21:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.357 15:21:50 -- accel/accel.sh@20 -- # IFS=: 00:06:05.357 15:21:50 -- accel/accel.sh@20 -- # read -r var val 00:06:05.357 15:21:50 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:05.357 15:21:50 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:05.357 15:21:50 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:05.357 00:06:05.357 real 0m2.658s 00:06:05.357 user 0m2.412s 00:06:05.357 sys 0m0.253s 00:06:05.357 15:21:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:05.357 15:21:50 -- common/autotest_common.sh@10 -- # set +x 00:06:05.357 ************************************ 00:06:05.357 END TEST accel_xor 00:06:05.357 ************************************ 00:06:05.357 15:21:50 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:05.357 15:21:50 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:05.357 15:21:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:05.357 15:21:50 -- common/autotest_common.sh@10 -- # set +x 00:06:05.357 ************************************ 00:06:05.357 START TEST 
accel_dif_verify 00:06:05.357 ************************************ 00:06:05.357 15:21:50 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_verify 00:06:05.357 15:21:50 -- accel/accel.sh@16 -- # local accel_opc 00:06:05.357 15:21:50 -- accel/accel.sh@17 -- # local accel_module 00:06:05.357 15:21:50 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:06:05.357 15:21:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:05.357 15:21:50 -- accel/accel.sh@12 -- # build_accel_config 00:06:05.357 15:21:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:05.357 15:21:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:05.357 15:21:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:05.357 15:21:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:05.357 15:21:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:05.357 15:21:50 -- accel/accel.sh@41 -- # local IFS=, 00:06:05.357 15:21:50 -- accel/accel.sh@42 -- # jq -r . 00:06:05.357 [2024-12-02 15:21:50.929101] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:05.357 [2024-12-02 15:21:50.929180] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1588840 ] 00:06:05.357 EAL: No free 2048 kB hugepages reported on node 1 00:06:05.357 [2024-12-02 15:21:50.997862] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.357 [2024-12-02 15:21:51.064924] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.735 15:21:52 -- accel/accel.sh@18 -- # out=' 00:06:06.735 SPDK Configuration: 00:06:06.735 Core mask: 0x1 00:06:06.735 00:06:06.735 Accel Perf Configuration: 00:06:06.735 Workload Type: dif_verify 00:06:06.735 Vector size: 4096 bytes 00:06:06.735 Transfer size: 4096 bytes 00:06:06.735 Block size: 512 bytes 00:06:06.735 Metadata size: 8 bytes 00:06:06.735 Vector count 1 00:06:06.735 Module: software 00:06:06.735 Queue depth: 32 00:06:06.735 Allocate depth: 32 00:06:06.735 # threads/core: 1 00:06:06.735 Run time: 1 seconds 00:06:06.735 Verify: No 00:06:06.735 00:06:06.735 Running for 1 seconds... 00:06:06.735 00:06:06.735 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:06.735 ------------------------------------------------------------------------------------ 00:06:06.735 0,0 244640/s 970 MiB/s 0 0 00:06:06.735 ==================================================================================== 00:06:06.735 Total 244640/s 955 MiB/s 0 0' 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.735 15:21:52 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:06.735 15:21:52 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:06.735 15:21:52 -- accel/accel.sh@12 -- # build_accel_config 00:06:06.735 15:21:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:06.735 15:21:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:06.735 15:21:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:06.735 15:21:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:06.735 15:21:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:06.735 15:21:52 -- accel/accel.sh@41 -- # local IFS=, 00:06:06.735 15:21:52 -- accel/accel.sh@42 -- # jq -r . 
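
dif_verify, summarized above, checks T10-style DIF protection information rather than copying data: per its configuration block, each 4096-byte buffer is treated as eight 512-byte blocks with 8 bytes of metadata per block, and 'Verify: No' because the opcode itself is the verification. The two bandwidth figures differ in what they count; the per-core row appears to include the 64 bytes of metadata per buffer while the Total row counts payload only:

    echo $(( 244640 * (4096 + 64) / 1024 / 1024 ))   # => 970 MiB/s, the 0,0 row
    echo $(( 244640 * 4096 / 1024 / 1024 ))          # => 955 MiB/s, the Total row
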
00:06:06.735 [2024-12-02 15:21:52.256180] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:06.735 [2024-12-02 15:21:52.256275] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1589106 ] 00:06:06.735 EAL: No free 2048 kB hugepages reported on node 1 00:06:06.735 [2024-12-02 15:21:52.328003] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:06.735 [2024-12-02 15:21:52.393865] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.735 15:21:52 -- accel/accel.sh@21 -- # val= 00:06:06.735 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.735 15:21:52 -- accel/accel.sh@21 -- # val= 00:06:06.735 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.735 15:21:52 -- accel/accel.sh@21 -- # val=0x1 00:06:06.735 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.735 15:21:52 -- accel/accel.sh@21 -- # val= 00:06:06.735 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.735 15:21:52 -- accel/accel.sh@21 -- # val= 00:06:06.735 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.735 15:21:52 -- accel/accel.sh@21 -- # val=dif_verify 00:06:06.735 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.735 15:21:52 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.735 15:21:52 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:06.735 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.735 15:21:52 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:06.735 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.735 15:21:52 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:06.735 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.735 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.736 15:21:52 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:06.736 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.736 15:21:52 -- accel/accel.sh@21 -- # val= 00:06:06.736 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.736 15:21:52 -- accel/accel.sh@21 -- # val=software 00:06:06.736 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.736 15:21:52 -- accel/accel.sh@23 -- # 
accel_module=software 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.736 15:21:52 -- accel/accel.sh@21 -- # val=32 00:06:06.736 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.736 15:21:52 -- accel/accel.sh@21 -- # val=32 00:06:06.736 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.736 15:21:52 -- accel/accel.sh@21 -- # val=1 00:06:06.736 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.736 15:21:52 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:06.736 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.736 15:21:52 -- accel/accel.sh@21 -- # val=No 00:06:06.736 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.736 15:21:52 -- accel/accel.sh@21 -- # val= 00:06:06.736 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:06.736 15:21:52 -- accel/accel.sh@21 -- # val= 00:06:06.736 15:21:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # IFS=: 00:06:06.736 15:21:52 -- accel/accel.sh@20 -- # read -r var val 00:06:08.114 15:21:53 -- accel/accel.sh@21 -- # val= 00:06:08.114 15:21:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.114 15:21:53 -- accel/accel.sh@20 -- # IFS=: 00:06:08.114 15:21:53 -- accel/accel.sh@20 -- # read -r var val 00:06:08.114 15:21:53 -- accel/accel.sh@21 -- # val= 00:06:08.114 15:21:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.114 15:21:53 -- accel/accel.sh@20 -- # IFS=: 00:06:08.114 15:21:53 -- accel/accel.sh@20 -- # read -r var val 00:06:08.114 15:21:53 -- accel/accel.sh@21 -- # val= 00:06:08.114 15:21:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.114 15:21:53 -- accel/accel.sh@20 -- # IFS=: 00:06:08.114 15:21:53 -- accel/accel.sh@20 -- # read -r var val 00:06:08.114 15:21:53 -- accel/accel.sh@21 -- # val= 00:06:08.114 15:21:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.114 15:21:53 -- accel/accel.sh@20 -- # IFS=: 00:06:08.114 15:21:53 -- accel/accel.sh@20 -- # read -r var val 00:06:08.114 15:21:53 -- accel/accel.sh@21 -- # val= 00:06:08.114 15:21:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.114 15:21:53 -- accel/accel.sh@20 -- # IFS=: 00:06:08.114 15:21:53 -- accel/accel.sh@20 -- # read -r var val 00:06:08.114 15:21:53 -- accel/accel.sh@21 -- # val= 00:06:08.114 15:21:53 -- accel/accel.sh@22 -- # case "$var" in 00:06:08.114 15:21:53 -- accel/accel.sh@20 -- # IFS=: 00:06:08.114 15:21:53 -- accel/accel.sh@20 -- # read -r var val 00:06:08.114 15:21:53 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:08.114 15:21:53 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:06:08.114 15:21:53 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:08.114 00:06:08.114 real 0m2.661s 00:06:08.114 user 0m2.402s 00:06:08.114 sys 0m0.266s 00:06:08.114 15:21:53 -- 
common/autotest_common.sh@1115 -- # xtrace_disable 00:06:08.114 15:21:53 -- common/autotest_common.sh@10 -- # set +x 00:06:08.114 ************************************ 00:06:08.114 END TEST accel_dif_verify 00:06:08.114 ************************************ 00:06:08.114 15:21:53 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:06:08.114 15:21:53 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:08.114 15:21:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:08.114 15:21:53 -- common/autotest_common.sh@10 -- # set +x 00:06:08.114 ************************************ 00:06:08.114 START TEST accel_dif_generate 00:06:08.114 ************************************ 00:06:08.114 15:21:53 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate 00:06:08.114 15:21:53 -- accel/accel.sh@16 -- # local accel_opc 00:06:08.114 15:21:53 -- accel/accel.sh@17 -- # local accel_module 00:06:08.114 15:21:53 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:06:08.114 15:21:53 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:08.114 15:21:53 -- accel/accel.sh@12 -- # build_accel_config 00:06:08.114 15:21:53 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:08.114 15:21:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:08.114 15:21:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:08.114 15:21:53 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:08.114 15:21:53 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:08.114 15:21:53 -- accel/accel.sh@41 -- # local IFS=, 00:06:08.114 15:21:53 -- accel/accel.sh@42 -- # jq -r . 00:06:08.114 [2024-12-02 15:21:53.640642] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:08.114 [2024-12-02 15:21:53.640754] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1589400 ] 00:06:08.114 EAL: No free 2048 kB hugepages reported on node 1 00:06:08.114 [2024-12-02 15:21:53.712116] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:08.115 [2024-12-02 15:21:53.779177] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.491 15:21:54 -- accel/accel.sh@18 -- # out=' 00:06:09.491 SPDK Configuration: 00:06:09.491 Core mask: 0x1 00:06:09.491 00:06:09.491 Accel Perf Configuration: 00:06:09.491 Workload Type: dif_generate 00:06:09.491 Vector size: 4096 bytes 00:06:09.491 Transfer size: 4096 bytes 00:06:09.491 Block size: 512 bytes 00:06:09.491 Metadata size: 8 bytes 00:06:09.491 Vector count 1 00:06:09.491 Module: software 00:06:09.491 Queue depth: 32 00:06:09.491 Allocate depth: 32 00:06:09.491 # threads/core: 1 00:06:09.491 Run time: 1 seconds 00:06:09.491 Verify: No 00:06:09.491 00:06:09.491 Running for 1 seconds... 
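Most of the trace volume in this section is a single loop: accel.sh splits each expected "option: value" line on ':' via IFS=: and read -r var val, then matches the key in a case statement (the repeating case "$var" in / IFS=: / read -r var val triples above). A self-contained illustration of that parsing pattern, not the verbatim accel.sh body:

    # Split "key: value" pairs the way the traced loop does. With IFS=:,
    # only ':' separates fields; values containing spaces are what the
    # xtrace shows quoted, e.g. val='4096 bytes' or val='1 seconds'.
    printf '%s\n' 'Workload Type: dif_verify' 'Transfer size: 4096 bytes' |
    while IFS=: read -r var val; do
        case "$var" in
            'Workload Type') echo "workload =${val}" ;;
            'Transfer size') echo "transfer =${val}" ;;
        esac
    done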
00:06:09.491 00:06:09.491 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:09.491 ------------------------------------------------------------------------------------ 00:06:09.491 0,0 287168/s 1139 MiB/s 0 0 00:06:09.491 ==================================================================================== 00:06:09.491 Total 287168/s 1121 MiB/s 0 0' 00:06:09.491 15:21:54 -- accel/accel.sh@20 -- # IFS=: 00:06:09.491 15:21:54 -- accel/accel.sh@20 -- # read -r var val 00:06:09.491 15:21:54 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:06:09.491 15:21:54 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:09.491 15:21:54 -- accel/accel.sh@12 -- # build_accel_config 00:06:09.491 15:21:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:09.491 15:21:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:09.491 15:21:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:09.491 15:21:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:09.491 15:21:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:09.491 15:21:54 -- accel/accel.sh@41 -- # local IFS=, 00:06:09.491 15:21:54 -- accel/accel.sh@42 -- # jq -r . 00:06:09.491 [2024-12-02 15:21:54.968967] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:09.491 [2024-12-02 15:21:54.969059] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1589666 ] 00:06:09.491 EAL: No free 2048 kB hugepages reported on node 1 00:06:09.491 [2024-12-02 15:21:55.039217] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:09.491 [2024-12-02 15:21:55.104615] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.491 15:21:55 -- accel/accel.sh@21 -- # val= 00:06:09.491 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.491 15:21:55 -- accel/accel.sh@21 -- # val= 00:06:09.491 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.491 15:21:55 -- accel/accel.sh@21 -- # val=0x1 00:06:09.491 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.491 15:21:55 -- accel/accel.sh@21 -- # val= 00:06:09.491 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.491 15:21:55 -- accel/accel.sh@21 -- # val= 00:06:09.491 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.491 15:21:55 -- accel/accel.sh@21 -- # val=dif_generate 00:06:09.491 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.491 15:21:55 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.491 15:21:55 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:09.491 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # IFS=: 
00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.491 15:21:55 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:09.491 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.491 15:21:55 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:09.491 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.491 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.492 15:21:55 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:09.492 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.492 15:21:55 -- accel/accel.sh@21 -- # val= 00:06:09.492 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.492 15:21:55 -- accel/accel.sh@21 -- # val=software 00:06:09.492 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.492 15:21:55 -- accel/accel.sh@23 -- # accel_module=software 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.492 15:21:55 -- accel/accel.sh@21 -- # val=32 00:06:09.492 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.492 15:21:55 -- accel/accel.sh@21 -- # val=32 00:06:09.492 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.492 15:21:55 -- accel/accel.sh@21 -- # val=1 00:06:09.492 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.492 15:21:55 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:09.492 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.492 15:21:55 -- accel/accel.sh@21 -- # val=No 00:06:09.492 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.492 15:21:55 -- accel/accel.sh@21 -- # val= 00:06:09.492 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:09.492 15:21:55 -- accel/accel.sh@21 -- # val= 00:06:09.492 15:21:55 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # IFS=: 00:06:09.492 15:21:55 -- accel/accel.sh@20 -- # read -r var val 00:06:10.870 15:21:56 -- accel/accel.sh@21 -- # val= 00:06:10.870 15:21:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.870 15:21:56 -- accel/accel.sh@20 -- # IFS=: 00:06:10.870 15:21:56 -- accel/accel.sh@20 -- # read -r var val 00:06:10.870 15:21:56 -- accel/accel.sh@21 -- # val= 00:06:10.870 15:21:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.870 15:21:56 -- accel/accel.sh@20 -- # IFS=: 00:06:10.870 15:21:56 -- accel/accel.sh@20 -- # read -r var val 00:06:10.870 15:21:56 -- accel/accel.sh@21 -- # val= 00:06:10.870 15:21:56 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:10.870 15:21:56 -- accel/accel.sh@20 -- # IFS=: 00:06:10.870 15:21:56 -- accel/accel.sh@20 -- # read -r var val 00:06:10.870 15:21:56 -- accel/accel.sh@21 -- # val= 00:06:10.870 15:21:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.870 15:21:56 -- accel/accel.sh@20 -- # IFS=: 00:06:10.870 15:21:56 -- accel/accel.sh@20 -- # read -r var val 00:06:10.870 15:21:56 -- accel/accel.sh@21 -- # val= 00:06:10.870 15:21:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.870 15:21:56 -- accel/accel.sh@20 -- # IFS=: 00:06:10.870 15:21:56 -- accel/accel.sh@20 -- # read -r var val 00:06:10.870 15:21:56 -- accel/accel.sh@21 -- # val= 00:06:10.870 15:21:56 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.870 15:21:56 -- accel/accel.sh@20 -- # IFS=: 00:06:10.870 15:21:56 -- accel/accel.sh@20 -- # read -r var val 00:06:10.870 15:21:56 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:10.870 15:21:56 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:06:10.870 15:21:56 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:10.870 00:06:10.870 real 0m2.662s 00:06:10.870 user 0m2.409s 00:06:10.870 sys 0m0.261s 00:06:10.870 15:21:56 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:10.870 15:21:56 -- common/autotest_common.sh@10 -- # set +x 00:06:10.870 ************************************ 00:06:10.870 END TEST accel_dif_generate 00:06:10.870 ************************************ 00:06:10.870 15:21:56 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:06:10.870 15:21:56 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:10.870 15:21:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:10.870 15:21:56 -- common/autotest_common.sh@10 -- # set +x 00:06:10.870 ************************************ 00:06:10.870 START TEST accel_dif_generate_copy 00:06:10.870 ************************************ 00:06:10.870 15:21:56 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate_copy 00:06:10.870 15:21:56 -- accel/accel.sh@16 -- # local accel_opc 00:06:10.870 15:21:56 -- accel/accel.sh@17 -- # local accel_module 00:06:10.870 15:21:56 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:06:10.870 15:21:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:10.870 15:21:56 -- accel/accel.sh@12 -- # build_accel_config 00:06:10.870 15:21:56 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:10.870 15:21:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:10.870 15:21:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:10.870 15:21:56 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:10.870 15:21:56 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:10.870 15:21:56 -- accel/accel.sh@41 -- # local IFS=, 00:06:10.870 15:21:56 -- accel/accel.sh@42 -- # jq -r . 00:06:10.870 [2024-12-02 15:21:56.350558] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
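Each START TEST / END TEST banner pair in this log is printed by the run_test wrapper, which also produces the real/user/sys lines via bash's time. A paraphrase of its shape, with the caveat that the real helper in test/common/autotest_common.sh is more elaborate (the '[' 6 -le 1 ']' checks visible above are its argument-count guard):

    run_test() {
        local name=$1
        shift
        echo '************************************'
        echo "START TEST $name"
        echo '************************************'
        time "$@"        # emits the real/user/sys lines seen in this log
        local rc=$?
        echo '************************************'
        echo "END TEST $name"
        echo '************************************'
        return $rc
    }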
00:06:10.870 [2024-12-02 15:21:56.350641] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1589949 ] 00:06:10.870 EAL: No free 2048 kB hugepages reported on node 1 00:06:10.870 [2024-12-02 15:21:56.420334] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.870 [2024-12-02 15:21:56.487535] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:12.250 15:21:57 -- accel/accel.sh@18 -- # out=' 00:06:12.250 SPDK Configuration: 00:06:12.250 Core mask: 0x1 00:06:12.250 00:06:12.250 Accel Perf Configuration: 00:06:12.250 Workload Type: dif_generate_copy 00:06:12.250 Vector size: 4096 bytes 00:06:12.250 Transfer size: 4096 bytes 00:06:12.250 Vector count 1 00:06:12.250 Module: software 00:06:12.250 Queue depth: 32 00:06:12.250 Allocate depth: 32 00:06:12.250 # threads/core: 1 00:06:12.250 Run time: 1 seconds 00:06:12.250 Verify: No 00:06:12.250 00:06:12.250 Running for 1 seconds... 00:06:12.250 00:06:12.250 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:12.250 ------------------------------------------------------------------------------------ 00:06:12.250 0,0 225408/s 894 MiB/s 0 0 00:06:12.250 ==================================================================================== 00:06:12.250 Total 225408/s 880 MiB/s 0 0' 00:06:12.250 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.250 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:12.250 15:21:57 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:06:12.250 15:21:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:12.250 15:21:57 -- accel/accel.sh@12 -- # build_accel_config 00:06:12.250 15:21:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:12.250 15:21:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:12.250 15:21:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:12.250 15:21:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:12.250 15:21:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:12.250 15:21:57 -- accel/accel.sh@41 -- # local IFS=, 00:06:12.250 15:21:57 -- accel/accel.sh@42 -- # jq -r . 00:06:12.250 [2024-12-02 15:21:57.678450] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:12.250 [2024-12-02 15:21:57.678540] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1590135 ] 00:06:12.250 EAL: No free 2048 kB hugepages reported on node 1 00:06:12.250 [2024-12-02 15:21:57.748497] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:12.250 [2024-12-02 15:21:57.818273] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:12.250 15:21:57 -- accel/accel.sh@21 -- # val= 00:06:12.250 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.250 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.250 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:12.250 15:21:57 -- accel/accel.sh@21 -- # val= 00:06:12.250 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.250 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.250 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:12.250 15:21:57 -- accel/accel.sh@21 -- # val=0x1 00:06:12.250 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.250 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:12.251 15:21:57 -- accel/accel.sh@21 -- # val= 00:06:12.251 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:12.251 15:21:57 -- accel/accel.sh@21 -- # val= 00:06:12.251 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:12.251 15:21:57 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:06:12.251 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.251 15:21:57 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:12.251 15:21:57 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:12.251 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:12.251 15:21:57 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:12.251 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:12.251 15:21:57 -- accel/accel.sh@21 -- # val= 00:06:12.251 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:12.251 15:21:57 -- accel/accel.sh@21 -- # val=software 00:06:12.251 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.251 15:21:57 -- accel/accel.sh@23 -- # accel_module=software 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:12.251 15:21:57 -- accel/accel.sh@21 -- # val=32 00:06:12.251 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:12.251 15:21:57 -- accel/accel.sh@21 -- # val=32 00:06:12.251 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # read -r 
var val 00:06:12.251 15:21:57 -- accel/accel.sh@21 -- # val=1 00:06:12.251 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:12.251 15:21:57 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:12.251 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:12.251 15:21:57 -- accel/accel.sh@21 -- # val=No 00:06:12.251 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:12.251 15:21:57 -- accel/accel.sh@21 -- # val= 00:06:12.251 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:12.251 15:21:57 -- accel/accel.sh@21 -- # val= 00:06:12.251 15:21:57 -- accel/accel.sh@22 -- # case "$var" in 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # IFS=: 00:06:12.251 15:21:57 -- accel/accel.sh@20 -- # read -r var val 00:06:13.190 15:21:58 -- accel/accel.sh@21 -- # val= 00:06:13.190 15:21:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.190 15:21:58 -- accel/accel.sh@20 -- # IFS=: 00:06:13.190 15:21:58 -- accel/accel.sh@20 -- # read -r var val 00:06:13.190 15:21:58 -- accel/accel.sh@21 -- # val= 00:06:13.190 15:21:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.190 15:21:58 -- accel/accel.sh@20 -- # IFS=: 00:06:13.190 15:21:58 -- accel/accel.sh@20 -- # read -r var val 00:06:13.190 15:21:58 -- accel/accel.sh@21 -- # val= 00:06:13.190 15:21:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.190 15:21:58 -- accel/accel.sh@20 -- # IFS=: 00:06:13.190 15:21:58 -- accel/accel.sh@20 -- # read -r var val 00:06:13.190 15:21:58 -- accel/accel.sh@21 -- # val= 00:06:13.190 15:21:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.190 15:21:58 -- accel/accel.sh@20 -- # IFS=: 00:06:13.190 15:21:58 -- accel/accel.sh@20 -- # read -r var val 00:06:13.190 15:21:58 -- accel/accel.sh@21 -- # val= 00:06:13.190 15:21:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.190 15:21:58 -- accel/accel.sh@20 -- # IFS=: 00:06:13.190 15:21:58 -- accel/accel.sh@20 -- # read -r var val 00:06:13.190 15:21:58 -- accel/accel.sh@21 -- # val= 00:06:13.190 15:21:58 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.190 15:21:58 -- accel/accel.sh@20 -- # IFS=: 00:06:13.190 15:21:58 -- accel/accel.sh@20 -- # read -r var val 00:06:13.190 15:21:58 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:13.190 15:21:58 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:06:13.190 15:21:58 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:13.190 00:06:13.190 real 0m2.666s 00:06:13.190 user 0m2.416s 00:06:13.190 sys 0m0.259s 00:06:13.190 15:21:58 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:13.190 15:21:58 -- common/autotest_common.sh@10 -- # set +x 00:06:13.190 ************************************ 00:06:13.190 END TEST accel_dif_generate_copy 00:06:13.190 ************************************ 00:06:13.450 15:21:59 -- accel/accel.sh@107 -- # [[ y == y ]] 00:06:13.450 15:21:59 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:13.450 15:21:59 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:13.450 15:21:59 -- 
common/autotest_common.sh@1093 -- # xtrace_disable 00:06:13.450 15:21:59 -- common/autotest_common.sh@10 -- # set +x 00:06:13.450 ************************************ 00:06:13.450 START TEST accel_comp 00:06:13.450 ************************************ 00:06:13.450 15:21:59 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:13.450 15:21:59 -- accel/accel.sh@16 -- # local accel_opc 00:06:13.450 15:21:59 -- accel/accel.sh@17 -- # local accel_module 00:06:13.450 15:21:59 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:13.450 15:21:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:13.450 15:21:59 -- accel/accel.sh@12 -- # build_accel_config 00:06:13.450 15:21:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:13.450 15:21:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:13.450 15:21:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:13.450 15:21:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:13.450 15:21:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:13.450 15:21:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:13.450 15:21:59 -- accel/accel.sh@42 -- # jq -r . 00:06:13.450 [2024-12-02 15:21:59.065042] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:13.450 [2024-12-02 15:21:59.065132] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1590348 ] 00:06:13.450 EAL: No free 2048 kB hugepages reported on node 1 00:06:13.450 [2024-12-02 15:21:59.135826] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.450 [2024-12-02 15:21:59.203683] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.832 15:22:00 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:14.832 00:06:14.832 SPDK Configuration: 00:06:14.832 Core mask: 0x1 00:06:14.832 00:06:14.832 Accel Perf Configuration: 00:06:14.832 Workload Type: compress 00:06:14.832 Transfer size: 4096 bytes 00:06:14.832 Vector count 1 00:06:14.832 Module: software 00:06:14.832 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:14.832 Queue depth: 32 00:06:14.832 Allocate depth: 32 00:06:14.832 # threads/core: 1 00:06:14.832 Run time: 1 seconds 00:06:14.832 Verify: No 00:06:14.832 00:06:14.832 Running for 1 seconds... 
00:06:14.832 00:06:14.832 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:14.832 ------------------------------------------------------------------------------------ 00:06:14.832 0,0 67552/s 281 MiB/s 0 0 00:06:14.832 ==================================================================================== 00:06:14.832 Total 67552/s 263 MiB/s 0 0' 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:14.832 15:22:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:14.832 15:22:00 -- accel/accel.sh@12 -- # build_accel_config 00:06:14.832 15:22:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:14.832 15:22:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:14.832 15:22:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:14.832 15:22:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:14.832 15:22:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:14.832 15:22:00 -- accel/accel.sh@41 -- # local IFS=, 00:06:14.832 15:22:00 -- accel/accel.sh@42 -- # jq -r . 00:06:14.832 [2024-12-02 15:22:00.398425] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:14.832 [2024-12-02 15:22:00.398517] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1590528 ] 00:06:14.832 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.832 [2024-12-02 15:22:00.470808] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.832 [2024-12-02 15:22:00.538510] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val= 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val= 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val= 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val=0x1 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val= 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val= 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val=compress 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 
15:22:00 -- accel/accel.sh@24 -- # accel_opc=compress 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val= 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val=software 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@23 -- # accel_module=software 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val=32 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val=32 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val=1 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val=No 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val= 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:14.832 15:22:00 -- accel/accel.sh@21 -- # val= 00:06:14.832 15:22:00 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # IFS=: 00:06:14.832 15:22:00 -- accel/accel.sh@20 -- # read -r var val 00:06:16.213 15:22:01 -- accel/accel.sh@21 -- # val= 00:06:16.213 15:22:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.213 15:22:01 -- accel/accel.sh@20 -- # IFS=: 00:06:16.213 15:22:01 -- accel/accel.sh@20 -- # read -r var val 00:06:16.213 15:22:01 -- accel/accel.sh@21 -- # val= 00:06:16.213 15:22:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.213 15:22:01 -- accel/accel.sh@20 -- # IFS=: 00:06:16.213 15:22:01 -- accel/accel.sh@20 -- # read -r var val 00:06:16.213 15:22:01 -- accel/accel.sh@21 -- # val= 00:06:16.213 15:22:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.213 15:22:01 -- accel/accel.sh@20 -- # 
IFS=: 00:06:16.213 15:22:01 -- accel/accel.sh@20 -- # read -r var val 00:06:16.213 15:22:01 -- accel/accel.sh@21 -- # val= 00:06:16.213 15:22:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.213 15:22:01 -- accel/accel.sh@20 -- # IFS=: 00:06:16.213 15:22:01 -- accel/accel.sh@20 -- # read -r var val 00:06:16.213 15:22:01 -- accel/accel.sh@21 -- # val= 00:06:16.213 15:22:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.213 15:22:01 -- accel/accel.sh@20 -- # IFS=: 00:06:16.213 15:22:01 -- accel/accel.sh@20 -- # read -r var val 00:06:16.213 15:22:01 -- accel/accel.sh@21 -- # val= 00:06:16.213 15:22:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.213 15:22:01 -- accel/accel.sh@20 -- # IFS=: 00:06:16.213 15:22:01 -- accel/accel.sh@20 -- # read -r var val 00:06:16.213 15:22:01 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:16.213 15:22:01 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:06:16.213 15:22:01 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:16.213 00:06:16.213 real 0m2.673s 00:06:16.213 user 0m2.428s 00:06:16.213 sys 0m0.253s 00:06:16.213 15:22:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:16.213 15:22:01 -- common/autotest_common.sh@10 -- # set +x 00:06:16.213 ************************************ 00:06:16.213 END TEST accel_comp 00:06:16.213 ************************************ 00:06:16.213 15:22:01 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:16.213 15:22:01 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:16.213 15:22:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:16.213 15:22:01 -- common/autotest_common.sh@10 -- # set +x 00:06:16.213 ************************************ 00:06:16.213 START TEST accel_decomp 00:06:16.213 ************************************ 00:06:16.213 15:22:01 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:16.213 15:22:01 -- accel/accel.sh@16 -- # local accel_opc 00:06:16.213 15:22:01 -- accel/accel.sh@17 -- # local accel_module 00:06:16.213 15:22:01 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:16.213 15:22:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:16.213 15:22:01 -- accel/accel.sh@12 -- # build_accel_config 00:06:16.213 15:22:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:16.213 15:22:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:16.213 15:22:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:16.213 15:22:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:16.213 15:22:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:16.213 15:22:01 -- accel/accel.sh@41 -- # local IFS=, 00:06:16.213 15:22:01 -- accel/accel.sh@42 -- # jq -r . 00:06:16.213 [2024-12-02 15:22:01.785277] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
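The compress and decompress cases add two flags the DIF cases do not use, both visible verbatim in the trace: -l names the input file (test/accel/bib in the SPDK tree) and -y enables output verification, which is why the decompress runs report Verify: Yes while the compress run, which omits -y, reports Verify: No. A by-hand re-run under the same SPDK_DIR assumption as the earlier sketch:

    "$SPDK_DIR/build/examples/accel_perf" -t 1 -w compress   -l "$SPDK_DIR/test/accel/bib"
    "$SPDK_DIR/build/examples/accel_perf" -t 1 -w decompress -l "$SPDK_DIR/test/accel/bib" -y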
00:06:16.213 [2024-12-02 15:22:01.785378] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1590811 ] 00:06:16.213 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.213 [2024-12-02 15:22:01.855053] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.213 [2024-12-02 15:22:01.923677] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.592 15:22:03 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:17.592 00:06:17.592 SPDK Configuration: 00:06:17.592 Core mask: 0x1 00:06:17.592 00:06:17.592 Accel Perf Configuration: 00:06:17.592 Workload Type: decompress 00:06:17.592 Transfer size: 4096 bytes 00:06:17.592 Vector count 1 00:06:17.592 Module: software 00:06:17.592 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:17.592 Queue depth: 32 00:06:17.592 Allocate depth: 32 00:06:17.592 # threads/core: 1 00:06:17.592 Run time: 1 seconds 00:06:17.592 Verify: Yes 00:06:17.592 00:06:17.592 Running for 1 seconds... 00:06:17.592 00:06:17.592 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:17.592 ------------------------------------------------------------------------------------ 00:06:17.592 0,0 88608/s 163 MiB/s 0 0 00:06:17.592 ==================================================================================== 00:06:17.592 Total 88608/s 346 MiB/s 0 0' 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.592 15:22:03 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:17.592 15:22:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:17.592 15:22:03 -- accel/accel.sh@12 -- # build_accel_config 00:06:17.592 15:22:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:17.592 15:22:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:17.592 15:22:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:17.592 15:22:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:17.592 15:22:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:17.592 15:22:03 -- accel/accel.sh@41 -- # local IFS=, 00:06:17.592 15:22:03 -- accel/accel.sh@42 -- # jq -r . 00:06:17.592 [2024-12-02 15:22:03.117374] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:17.592 [2024-12-02 15:22:03.117463] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1591083 ] 00:06:17.592 EAL: No free 2048 kB hugepages reported on node 1 00:06:17.592 [2024-12-02 15:22:03.187942] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:17.592 [2024-12-02 15:22:03.254069] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.592 15:22:03 -- accel/accel.sh@21 -- # val= 00:06:17.592 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.592 15:22:03 -- accel/accel.sh@21 -- # val= 00:06:17.592 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.592 15:22:03 -- accel/accel.sh@21 -- # val= 00:06:17.592 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.592 15:22:03 -- accel/accel.sh@21 -- # val=0x1 00:06:17.592 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.592 15:22:03 -- accel/accel.sh@21 -- # val= 00:06:17.592 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.592 15:22:03 -- accel/accel.sh@21 -- # val= 00:06:17.592 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.592 15:22:03 -- accel/accel.sh@21 -- # val=decompress 00:06:17.592 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.592 15:22:03 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.592 15:22:03 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:17.592 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.592 15:22:03 -- accel/accel.sh@21 -- # val= 00:06:17.592 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.592 15:22:03 -- accel/accel.sh@21 -- # val=software 00:06:17.592 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.592 15:22:03 -- accel/accel.sh@23 -- # accel_module=software 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.592 15:22:03 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:17.592 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.592 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.592 15:22:03 -- accel/accel.sh@21 -- # val=32 00:06:17.593 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.593 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.593 
15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.593 15:22:03 -- accel/accel.sh@21 -- # val=32 00:06:17.593 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.593 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.593 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.593 15:22:03 -- accel/accel.sh@21 -- # val=1 00:06:17.593 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.593 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.593 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.593 15:22:03 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:17.593 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.593 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.593 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.593 15:22:03 -- accel/accel.sh@21 -- # val=Yes 00:06:17.593 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.593 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.593 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.593 15:22:03 -- accel/accel.sh@21 -- # val= 00:06:17.593 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.593 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.593 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:17.593 15:22:03 -- accel/accel.sh@21 -- # val= 00:06:17.593 15:22:03 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.593 15:22:03 -- accel/accel.sh@20 -- # IFS=: 00:06:17.593 15:22:03 -- accel/accel.sh@20 -- # read -r var val 00:06:18.972 15:22:04 -- accel/accel.sh@21 -- # val= 00:06:18.972 15:22:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.972 15:22:04 -- accel/accel.sh@20 -- # IFS=: 00:06:18.972 15:22:04 -- accel/accel.sh@20 -- # read -r var val 00:06:18.972 15:22:04 -- accel/accel.sh@21 -- # val= 00:06:18.972 15:22:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.972 15:22:04 -- accel/accel.sh@20 -- # IFS=: 00:06:18.972 15:22:04 -- accel/accel.sh@20 -- # read -r var val 00:06:18.972 15:22:04 -- accel/accel.sh@21 -- # val= 00:06:18.972 15:22:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.972 15:22:04 -- accel/accel.sh@20 -- # IFS=: 00:06:18.972 15:22:04 -- accel/accel.sh@20 -- # read -r var val 00:06:18.972 15:22:04 -- accel/accel.sh@21 -- # val= 00:06:18.972 15:22:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.972 15:22:04 -- accel/accel.sh@20 -- # IFS=: 00:06:18.972 15:22:04 -- accel/accel.sh@20 -- # read -r var val 00:06:18.972 15:22:04 -- accel/accel.sh@21 -- # val= 00:06:18.972 15:22:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.972 15:22:04 -- accel/accel.sh@20 -- # IFS=: 00:06:18.972 15:22:04 -- accel/accel.sh@20 -- # read -r var val 00:06:18.972 15:22:04 -- accel/accel.sh@21 -- # val= 00:06:18.972 15:22:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.972 15:22:04 -- accel/accel.sh@20 -- # IFS=: 00:06:18.972 15:22:04 -- accel/accel.sh@20 -- # read -r var val 00:06:18.972 15:22:04 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:18.972 15:22:04 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:18.972 15:22:04 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:18.972 00:06:18.972 real 0m2.662s 00:06:18.972 user 0m2.418s 00:06:18.972 sys 0m0.252s 00:06:18.973 15:22:04 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:18.973 15:22:04 -- common/autotest_common.sh@10 -- # set +x 00:06:18.973 ************************************ 00:06:18.973 END TEST accel_decomp 00:06:18.973 ************************************ 00:06:18.973 15:22:04 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:18.973 15:22:04 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:06:18.973 15:22:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:18.973 15:22:04 -- common/autotest_common.sh@10 -- # set +x 00:06:18.973 ************************************ 00:06:18.973 START TEST accel_decmop_full 00:06:18.973 ************************************ 00:06:18.973 15:22:04 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:18.973 15:22:04 -- accel/accel.sh@16 -- # local accel_opc 00:06:18.973 15:22:04 -- accel/accel.sh@17 -- # local accel_module 00:06:18.973 15:22:04 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:18.973 15:22:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:18.973 15:22:04 -- accel/accel.sh@12 -- # build_accel_config 00:06:18.973 15:22:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:18.973 15:22:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:18.973 15:22:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:18.973 15:22:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:18.973 15:22:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:18.973 15:22:04 -- accel/accel.sh@41 -- # local IFS=, 00:06:18.973 15:22:04 -- accel/accel.sh@42 -- # jq -r . 00:06:18.973 [2024-12-02 15:22:04.496762] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:18.973 [2024-12-02 15:22:04.496852] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1591364 ] 00:06:18.973 EAL: No free 2048 kB hugepages reported on node 1 00:06:18.973 [2024-12-02 15:22:04.567269] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:18.973 [2024-12-02 15:22:04.633638] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.349 15:22:05 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:20.349 00:06:20.349 SPDK Configuration: 00:06:20.349 Core mask: 0x1 00:06:20.349 00:06:20.349 Accel Perf Configuration: 00:06:20.349 Workload Type: decompress 00:06:20.349 Transfer size: 111250 bytes 00:06:20.349 Vector count 1 00:06:20.349 Module: software 00:06:20.349 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:20.349 Queue depth: 32 00:06:20.349 Allocate depth: 32 00:06:20.349 # threads/core: 1 00:06:20.349 Run time: 1 seconds 00:06:20.349 Verify: Yes 00:06:20.349 00:06:20.349 Running for 1 seconds... 
00:06:20.349 00:06:20.349 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:20.349 ------------------------------------------------------------------------------------ 00:06:20.349 0,0 5824/s 240 MiB/s 0 0 00:06:20.349 ==================================================================================== 00:06:20.349 Total 5824/s 617 MiB/s 0 0' 00:06:20.349 15:22:05 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:05 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:05 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:20.349 15:22:05 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:20.349 15:22:05 -- accel/accel.sh@12 -- # build_accel_config 00:06:20.349 15:22:05 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:20.349 15:22:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:20.349 15:22:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:20.349 15:22:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:20.349 15:22:05 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:20.349 15:22:05 -- accel/accel.sh@41 -- # local IFS=, 00:06:20.349 15:22:05 -- accel/accel.sh@42 -- # jq -r . 00:06:20.349 [2024-12-02 15:22:05.831730] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:20.349 [2024-12-02 15:22:05.831833] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1591632 ] 00:06:20.349 EAL: No free 2048 kB hugepages reported on node 1 00:06:20.349 [2024-12-02 15:22:05.900959] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.349 [2024-12-02 15:22:05.969318] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val= 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val= 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val= 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val=0x1 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val= 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val= 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val=decompress 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val= 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val=software 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@23 -- # accel_module=software 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val=32 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val=32 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val=1 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val=Yes 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val= 00:06:20.349 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.349 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:20.349 15:22:06 -- accel/accel.sh@21 -- # val= 00:06:20.350 15:22:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.350 15:22:06 -- accel/accel.sh@20 -- # IFS=: 00:06:20.350 15:22:06 -- accel/accel.sh@20 -- # read -r var val 00:06:21.726 15:22:07 -- accel/accel.sh@21 -- # val= 00:06:21.726 15:22:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.726 15:22:07 -- accel/accel.sh@20 -- # IFS=: 00:06:21.726 15:22:07 -- accel/accel.sh@20 -- # read -r var val 00:06:21.726 15:22:07 -- accel/accel.sh@21 -- # val= 00:06:21.726 15:22:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.726 15:22:07 -- accel/accel.sh@20 -- # IFS=: 00:06:21.726 15:22:07 -- accel/accel.sh@20 -- # read -r var val 00:06:21.726 15:22:07 -- accel/accel.sh@21 -- # val= 00:06:21.726 15:22:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.726 15:22:07 
-- accel/accel.sh@20 -- # IFS=: 00:06:21.726 15:22:07 -- accel/accel.sh@20 -- # read -r var val 00:06:21.726 15:22:07 -- accel/accel.sh@21 -- # val= 00:06:21.726 15:22:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.726 15:22:07 -- accel/accel.sh@20 -- # IFS=: 00:06:21.727 15:22:07 -- accel/accel.sh@20 -- # read -r var val 00:06:21.727 15:22:07 -- accel/accel.sh@21 -- # val= 00:06:21.727 15:22:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.727 15:22:07 -- accel/accel.sh@20 -- # IFS=: 00:06:21.727 15:22:07 -- accel/accel.sh@20 -- # read -r var val 00:06:21.727 15:22:07 -- accel/accel.sh@21 -- # val= 00:06:21.727 15:22:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.727 15:22:07 -- accel/accel.sh@20 -- # IFS=: 00:06:21.727 15:22:07 -- accel/accel.sh@20 -- # read -r var val 00:06:21.727 15:22:07 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:21.727 15:22:07 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:21.727 15:22:07 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:21.727 00:06:21.727 real 0m2.684s 00:06:21.727 user 0m2.427s 00:06:21.727 sys 0m0.264s 00:06:21.727 15:22:07 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:21.727 15:22:07 -- common/autotest_common.sh@10 -- # set +x 00:06:21.727 ************************************ 00:06:21.727 END TEST accel_decmop_full 00:06:21.727 ************************************ 00:06:21.727 15:22:07 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:21.727 15:22:07 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:06:21.727 15:22:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:21.727 15:22:07 -- common/autotest_common.sh@10 -- # set +x 00:06:21.727 ************************************ 00:06:21.727 START TEST accel_decomp_mcore 00:06:21.727 ************************************ 00:06:21.727 15:22:07 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:21.727 15:22:07 -- accel/accel.sh@16 -- # local accel_opc 00:06:21.727 15:22:07 -- accel/accel.sh@17 -- # local accel_module 00:06:21.727 15:22:07 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:21.727 15:22:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:21.727 15:22:07 -- accel/accel.sh@12 -- # build_accel_config 00:06:21.727 15:22:07 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:21.727 15:22:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:21.727 15:22:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:21.727 15:22:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:21.727 15:22:07 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:21.727 15:22:07 -- accel/accel.sh@41 -- # local IFS=, 00:06:21.727 15:22:07 -- accel/accel.sh@42 -- # jq -r . 00:06:21.727 [2024-12-02 15:22:07.228157] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
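
The START TEST / END TEST banners and the real/user/sys timing lines that bracket each test in this log come from the suite's run_test wrapper in common/autotest_common.sh (the autotest_common.sh@1087/@1114 frames in the trace). A minimal sketch of that pattern, assuming plain bash; the real helper also manages xtrace and argument-count checks:

run_test() {
    local name="$1"
    shift
    echo "************************************"
    echo "START TEST $name"
    echo "************************************"
    time "$@"    # timing the test body is what produces the real/user/sys lines
    local rc=$?
    echo "************************************"
    echo "END TEST $name"
    echo "************************************"
    return "$rc"
}
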
00:06:21.727 [2024-12-02 15:22:07.228245] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1591928 ]
00:06:21.727 EAL: No free 2048 kB hugepages reported on node 1
00:06:21.727 [2024-12-02 15:22:07.297519] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4
00:06:21.727 [2024-12-02 15:22:07.367029] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:06:21.727 [2024-12-02 15:22:07.367122] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:06:21.727 [2024-12-02 15:22:07.367226] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:06:21.727 [2024-12-02 15:22:07.367228] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:23.104 15:22:08 -- accel/accel.sh@18 -- # out='Preparing input file...
00:06:23.104
00:06:23.104 SPDK Configuration:
00:06:23.104 Core mask: 0xf
00:06:23.104
00:06:23.104 Accel Perf Configuration:
00:06:23.104 Workload Type: decompress
00:06:23.104 Transfer size: 4096 bytes
00:06:23.104 Vector count 1
00:06:23.104 Module: software
00:06:23.104 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:06:23.104 Queue depth: 32
00:06:23.104 Allocate depth: 32
00:06:23.104 # threads/core: 1
00:06:23.104 Run time: 1 seconds
00:06:23.104 Verify: Yes
00:06:23.104
00:06:23.104 Running for 1 seconds...
00:06:23.104
00:06:23.104 Core,Thread Transfers Bandwidth Failed Miscompares
00:06:23.104 ------------------------------------------------------------------------------------
00:06:23.104 0,0 76128/s 140 MiB/s 0 0
00:06:23.104 3,0 75968/s 140 MiB/s 0 0
00:06:23.104 2,0 76000/s 140 MiB/s 0 0
00:06:23.104 1,0 76288/s 140 MiB/s 0 0
00:06:23.104 ====================================================================================
00:06:23.104 Total 304384/s 1189 MiB/s 0 0'
00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=:
00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val
00:06:23.104 15:22:08 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:06:23.104 15:22:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:06:23.104 15:22:08 -- accel/accel.sh@12 -- # build_accel_config
00:06:23.104 15:22:08 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:23.104 15:22:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:23.104 15:22:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:23.104 15:22:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:23.104 15:22:08 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:23.104 15:22:08 -- accel/accel.sh@41 -- # local IFS=,
00:06:23.104 15:22:08 -- accel/accel.sh@42 -- # jq -r .
00:06:23.104 [2024-12-02 15:22:08.566775] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
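
Each accel_perf invocation in the trace maps directly onto the configuration dump it prints: -t 1 is the one-second run time, -w decompress the workload type, -l the input file name, -y enables verification, and -m 0xf the core mask (cores 0-3), matching "Core mask: 0xf" above. A sketch of the same run issued by hand; dropping the -c /dev/fd/62 JSON config that build_accel_config pipes in is an assumption (the config is empty here, so the software module is selected either way):

# one-second verified decompress run on cores 0-3 (paths per this workspace)
cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y -m 0xf
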
00:06:23.104 [2024-12-02 15:22:08.566859] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1592164 ] 00:06:23.104 EAL: No free 2048 kB hugepages reported on node 1 00:06:23.104 [2024-12-02 15:22:08.636427] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:23.104 [2024-12-02 15:22:08.706535] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:23.104 [2024-12-02 15:22:08.706631] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:23.104 [2024-12-02 15:22:08.706716] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:23.104 [2024-12-02 15:22:08.706718] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val= 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val= 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val= 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val=0xf 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val= 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val= 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val=decompress 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val= 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val=software 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@23 -- # accel_module=software 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val=32 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val=32 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val=1 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val=Yes 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val= 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:23.104 15:22:08 -- accel/accel.sh@21 -- # val= 00:06:23.104 15:22:08 -- accel/accel.sh@22 -- # case "$var" in 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # IFS=: 00:06:23.104 15:22:08 -- accel/accel.sh@20 -- # read -r var val 00:06:24.482 15:22:09 -- accel/accel.sh@21 -- # val= 00:06:24.482 15:22:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # IFS=: 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # read -r var val 00:06:24.482 15:22:09 -- accel/accel.sh@21 -- # val= 00:06:24.482 15:22:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # IFS=: 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # read -r var val 00:06:24.482 15:22:09 -- accel/accel.sh@21 -- # val= 00:06:24.482 15:22:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # IFS=: 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # read -r var val 00:06:24.482 15:22:09 -- accel/accel.sh@21 -- # val= 00:06:24.482 15:22:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # IFS=: 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # read -r var val 00:06:24.482 15:22:09 -- accel/accel.sh@21 -- # val= 00:06:24.482 15:22:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # IFS=: 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # read -r var val 00:06:24.482 15:22:09 -- accel/accel.sh@21 -- # val= 00:06:24.482 15:22:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # IFS=: 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # read -r var val 00:06:24.482 15:22:09 -- accel/accel.sh@21 -- # val= 00:06:24.482 15:22:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # IFS=: 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # read -r var val 00:06:24.482 15:22:09 -- accel/accel.sh@21 -- # val= 00:06:24.482 15:22:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.482 
15:22:09 -- accel/accel.sh@20 -- # IFS=: 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # read -r var val 00:06:24.482 15:22:09 -- accel/accel.sh@21 -- # val= 00:06:24.482 15:22:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # IFS=: 00:06:24.482 15:22:09 -- accel/accel.sh@20 -- # read -r var val 00:06:24.482 15:22:09 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:24.482 15:22:09 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:24.482 15:22:09 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:24.482 00:06:24.482 real 0m2.689s 00:06:24.482 user 0m9.078s 00:06:24.482 sys 0m0.275s 00:06:24.482 15:22:09 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:24.482 15:22:09 -- common/autotest_common.sh@10 -- # set +x 00:06:24.482 ************************************ 00:06:24.482 END TEST accel_decomp_mcore 00:06:24.482 ************************************ 00:06:24.482 15:22:09 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:24.482 15:22:09 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:24.482 15:22:09 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:24.482 15:22:09 -- common/autotest_common.sh@10 -- # set +x 00:06:24.482 ************************************ 00:06:24.482 START TEST accel_decomp_full_mcore 00:06:24.482 ************************************ 00:06:24.482 15:22:09 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:24.482 15:22:09 -- accel/accel.sh@16 -- # local accel_opc 00:06:24.482 15:22:09 -- accel/accel.sh@17 -- # local accel_module 00:06:24.482 15:22:09 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:24.482 15:22:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:24.482 15:22:09 -- accel/accel.sh@12 -- # build_accel_config 00:06:24.482 15:22:09 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:24.483 15:22:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.483 15:22:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.483 15:22:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:24.483 15:22:09 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:24.483 15:22:09 -- accel/accel.sh@41 -- # local IFS=, 00:06:24.483 15:22:09 -- accel/accel.sh@42 -- # jq -r . 00:06:24.483 [2024-12-02 15:22:09.964669] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
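
The _full variants of these tests differ only in passing -o 0 to accel_perf: the configuration dump above reports "Transfer size: 4096 bytes", while the one below reports "Transfer size: 111250 bytes" for the same input file. A sketch of the pair, under the same path and config assumptions as the previous example:

# default transfer size (reported as 4096 bytes)
./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y -m 0xf
# -o 0: full-buffer variant (reported as 111250 bytes)
./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y -o 0 -m 0xf
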
00:06:24.483 [2024-12-02 15:22:09.964758] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1592376 ]
00:06:24.483 EAL: No free 2048 kB hugepages reported on node 1
00:06:24.483 [2024-12-02 15:22:10.035297] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4
00:06:24.483 [2024-12-02 15:22:10.117729] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:06:24.483 [2024-12-02 15:22:10.117825] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:06:24.483 [2024-12-02 15:22:10.117843] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:06:24.483 [2024-12-02 15:22:10.117846] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:25.859 15:22:11 -- accel/accel.sh@18 -- # out='Preparing input file...
00:06:25.859
00:06:25.859 SPDK Configuration:
00:06:25.859 Core mask: 0xf
00:06:25.859
00:06:25.859 Accel Perf Configuration:
00:06:25.859 Workload Type: decompress
00:06:25.859 Transfer size: 111250 bytes
00:06:25.859 Vector count 1
00:06:25.859 Module: software
00:06:25.859 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:06:25.859 Queue depth: 32
00:06:25.859 Allocate depth: 32
00:06:25.859 # threads/core: 1
00:06:25.859 Run time: 1 seconds
00:06:25.859 Verify: Yes
00:06:25.859
00:06:25.859 Running for 1 seconds...
00:06:25.859
00:06:25.859 Core,Thread Transfers Bandwidth Failed Miscompares
00:06:25.859 ------------------------------------------------------------------------------------
00:06:25.859 0,0 5792/s 239 MiB/s 0 0
00:06:25.859 3,0 5792/s 239 MiB/s 0 0
00:06:25.859 2,0 5824/s 240 MiB/s 0 0
00:06:25.859 1,0 5824/s 240 MiB/s 0 0
00:06:25.859 ====================================================================================
00:06:25.859 Total 23232/s 2464 MiB/s 0 0'
00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=:
00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val
00:06:25.860 15:22:11 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:06:25.860 15:22:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:06:25.860 15:22:11 -- accel/accel.sh@12 -- # build_accel_config
00:06:25.860 15:22:11 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:25.860 15:22:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:25.860 15:22:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:25.860 15:22:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:25.860 15:22:11 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:25.860 15:22:11 -- accel/accel.sh@41 -- # local IFS=,
00:06:25.860 15:22:11 -- accel/accel.sh@42 -- # jq -r .
00:06:25.860 [2024-12-02 15:22:11.325865] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
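
After each run the script replays accel_perf's output through the IFS=: / read -r var val loop visible in the trace, recording which opcode and module actually executed (the accel_opc=decompress and accel_module=software assignments above), and then asserts on them; the accel.sh@28 checks that close each test are that final gate. A sketch of those assertions:

[[ -n "$accel_module" ]]           # some module was reported at all
[[ -n "$accel_opc" ]]              # the opcode (decompress) was parsed out
[[ "$accel_module" == software ]]  # software fallback expected, since the
                                   # JSON accel config built above is empty
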
00:06:25.860 [2024-12-02 15:22:11.325943] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1592568 ] 00:06:25.860 EAL: No free 2048 kB hugepages reported on node 1 00:06:25.860 [2024-12-02 15:22:11.394637] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:25.860 [2024-12-02 15:22:11.464273] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:25.860 [2024-12-02 15:22:11.464371] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:25.860 [2024-12-02 15:22:11.464454] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:25.860 [2024-12-02 15:22:11.464456] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val= 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val= 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val= 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val=0xf 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val= 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val= 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val=decompress 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val= 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val=software 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@23 -- # accel_module=software 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val=32 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val=32 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val=1 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val=Yes 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val= 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:25.860 15:22:11 -- accel/accel.sh@21 -- # val= 00:06:25.860 15:22:11 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # IFS=: 00:06:25.860 15:22:11 -- accel/accel.sh@20 -- # read -r var val 00:06:27.240 15:22:12 -- accel/accel.sh@21 -- # val= 00:06:27.240 15:22:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # IFS=: 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # read -r var val 00:06:27.240 15:22:12 -- accel/accel.sh@21 -- # val= 00:06:27.240 15:22:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # IFS=: 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # read -r var val 00:06:27.240 15:22:12 -- accel/accel.sh@21 -- # val= 00:06:27.240 15:22:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # IFS=: 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # read -r var val 00:06:27.240 15:22:12 -- accel/accel.sh@21 -- # val= 00:06:27.240 15:22:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # IFS=: 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # read -r var val 00:06:27.240 15:22:12 -- accel/accel.sh@21 -- # val= 00:06:27.240 15:22:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # IFS=: 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # read -r var val 00:06:27.240 15:22:12 -- accel/accel.sh@21 -- # val= 00:06:27.240 15:22:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # IFS=: 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # read -r var val 00:06:27.240 15:22:12 -- accel/accel.sh@21 -- # val= 00:06:27.240 15:22:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # IFS=: 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # read -r var val 00:06:27.240 15:22:12 -- accel/accel.sh@21 -- # val= 00:06:27.240 15:22:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.240 
15:22:12 -- accel/accel.sh@20 -- # IFS=: 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # read -r var val 00:06:27.240 15:22:12 -- accel/accel.sh@21 -- # val= 00:06:27.240 15:22:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # IFS=: 00:06:27.240 15:22:12 -- accel/accel.sh@20 -- # read -r var val 00:06:27.240 15:22:12 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:27.240 15:22:12 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:27.240 15:22:12 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:27.240 00:06:27.240 real 0m2.717s 00:06:27.240 user 0m9.134s 00:06:27.240 sys 0m0.288s 00:06:27.240 15:22:12 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:27.240 15:22:12 -- common/autotest_common.sh@10 -- # set +x 00:06:27.240 ************************************ 00:06:27.240 END TEST accel_decomp_full_mcore 00:06:27.240 ************************************ 00:06:27.240 15:22:12 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:27.240 15:22:12 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:06:27.240 15:22:12 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:27.240 15:22:12 -- common/autotest_common.sh@10 -- # set +x 00:06:27.240 ************************************ 00:06:27.240 START TEST accel_decomp_mthread 00:06:27.240 ************************************ 00:06:27.240 15:22:12 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:27.240 15:22:12 -- accel/accel.sh@16 -- # local accel_opc 00:06:27.240 15:22:12 -- accel/accel.sh@17 -- # local accel_module 00:06:27.240 15:22:12 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:27.241 15:22:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:27.241 15:22:12 -- accel/accel.sh@12 -- # build_accel_config 00:06:27.241 15:22:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:27.241 15:22:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:27.241 15:22:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:27.241 15:22:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:27.241 15:22:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:27.241 15:22:12 -- accel/accel.sh@41 -- # local IFS=, 00:06:27.241 15:22:12 -- accel/accel.sh@42 -- # jq -r . 00:06:27.241 [2024-12-02 15:22:12.731092] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:27.241 [2024-12-02 15:22:12.731168] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1592802 ] 00:06:27.241 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.241 [2024-12-02 15:22:12.801216] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:27.241 [2024-12-02 15:22:12.871051] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.619 15:22:14 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:06:28.619
00:06:28.619 SPDK Configuration:
00:06:28.619 Core mask: 0x1
00:06:28.619
00:06:28.619 Accel Perf Configuration:
00:06:28.619 Workload Type: decompress
00:06:28.619 Transfer size: 4096 bytes
00:06:28.619 Vector count 1
00:06:28.619 Module: software
00:06:28.619 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib
00:06:28.619 Queue depth: 32
00:06:28.619 Allocate depth: 32
00:06:28.619 # threads/core: 2
00:06:28.619 Run time: 1 seconds
00:06:28.619 Verify: Yes
00:06:28.619
00:06:28.619 Running for 1 seconds...
00:06:28.619
00:06:28.619 Core,Thread Transfers Bandwidth Failed Miscompares
00:06:28.619 ------------------------------------------------------------------------------------
00:06:28.619 0,1 46176/s 85 MiB/s 0 0
00:06:28.619 0,0 46048/s 84 MiB/s 0 0
00:06:28.619 ====================================================================================
00:06:28.619 Total 92224/s 360 MiB/s 0 0'
00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=:
00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val
00:06:28.619 15:22:14 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2
00:06:28.619 15:22:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2
00:06:28.619 15:22:14 -- accel/accel.sh@12 -- # build_accel_config
00:06:28.619 15:22:14 -- accel/accel.sh@32 -- # accel_json_cfg=()
00:06:28.619 15:22:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:06:28.619 15:22:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:06:28.619 15:22:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]]
00:06:28.619 15:22:14 -- accel/accel.sh@37 -- # [[ -n '' ]]
00:06:28.619 15:22:14 -- accel/accel.sh@41 -- # local IFS=,
00:06:28.619 15:22:14 -- accel/accel.sh@42 -- # jq -r .
00:06:28.619 [2024-12-02 15:22:14.065670] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
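
This variant passes -T 2, and the configuration dump above duly reports "# threads/core: 2": the results table gains a 0,1 row alongside 0,0, i.e. two worker threads sharing core 0. A hypothetical helper for picking such a table apart from a saved report (it assumes the timestamp prefixes have been stripped from the file first):

awk '$1 ~ /^[0-9]+,[0-9]+$/ { split($1, ct, ","); printf "core=%s thread=%s rate=%s\n", ct[1], ct[2], $2 }' report.txt
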
00:06:28.619 [2024-12-02 15:22:14.065753] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1593070 ] 00:06:28.619 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.619 [2024-12-02 15:22:14.134554] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.619 [2024-12-02 15:22:14.202630] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val= 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val= 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val= 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val=0x1 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val= 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val= 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val=decompress 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val= 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val=software 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@23 -- # accel_module=software 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val=32 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 
15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val=32 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val=2 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val=Yes 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val= 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:28.619 15:22:14 -- accel/accel.sh@21 -- # val= 00:06:28.619 15:22:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # IFS=: 00:06:28.619 15:22:14 -- accel/accel.sh@20 -- # read -r var val 00:06:29.993 15:22:15 -- accel/accel.sh@21 -- # val= 00:06:29.993 15:22:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.993 15:22:15 -- accel/accel.sh@20 -- # IFS=: 00:06:29.993 15:22:15 -- accel/accel.sh@20 -- # read -r var val 00:06:29.993 15:22:15 -- accel/accel.sh@21 -- # val= 00:06:29.994 15:22:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.994 15:22:15 -- accel/accel.sh@20 -- # IFS=: 00:06:29.994 15:22:15 -- accel/accel.sh@20 -- # read -r var val 00:06:29.994 15:22:15 -- accel/accel.sh@21 -- # val= 00:06:29.994 15:22:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.994 15:22:15 -- accel/accel.sh@20 -- # IFS=: 00:06:29.994 15:22:15 -- accel/accel.sh@20 -- # read -r var val 00:06:29.994 15:22:15 -- accel/accel.sh@21 -- # val= 00:06:29.994 15:22:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.994 15:22:15 -- accel/accel.sh@20 -- # IFS=: 00:06:29.994 15:22:15 -- accel/accel.sh@20 -- # read -r var val 00:06:29.994 15:22:15 -- accel/accel.sh@21 -- # val= 00:06:29.994 15:22:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.994 15:22:15 -- accel/accel.sh@20 -- # IFS=: 00:06:29.994 15:22:15 -- accel/accel.sh@20 -- # read -r var val 00:06:29.994 15:22:15 -- accel/accel.sh@21 -- # val= 00:06:29.994 15:22:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.994 15:22:15 -- accel/accel.sh@20 -- # IFS=: 00:06:29.994 15:22:15 -- accel/accel.sh@20 -- # read -r var val 00:06:29.994 15:22:15 -- accel/accel.sh@21 -- # val= 00:06:29.994 15:22:15 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.994 15:22:15 -- accel/accel.sh@20 -- # IFS=: 00:06:29.994 15:22:15 -- accel/accel.sh@20 -- # read -r var val 00:06:29.994 15:22:15 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:29.994 15:22:15 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:29.994 15:22:15 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:29.994 00:06:29.994 real 0m2.674s 00:06:29.994 user 0m2.421s 00:06:29.994 sys 0m0.261s 00:06:29.994 15:22:15 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:29.994 15:22:15 -- common/autotest_common.sh@10 -- # 
set +x 00:06:29.994 ************************************ 00:06:29.994 END TEST accel_decomp_mthread 00:06:29.994 ************************************ 00:06:29.994 15:22:15 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:29.994 15:22:15 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:29.994 15:22:15 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:29.994 15:22:15 -- common/autotest_common.sh@10 -- # set +x 00:06:29.994 ************************************ 00:06:29.994 START TEST accel_deomp_full_mthread 00:06:29.994 ************************************ 00:06:29.994 15:22:15 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:29.994 15:22:15 -- accel/accel.sh@16 -- # local accel_opc 00:06:29.994 15:22:15 -- accel/accel.sh@17 -- # local accel_module 00:06:29.994 15:22:15 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:29.994 15:22:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:29.994 15:22:15 -- accel/accel.sh@12 -- # build_accel_config 00:06:29.994 15:22:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:29.994 15:22:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:29.994 15:22:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:29.994 15:22:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:29.994 15:22:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:29.994 15:22:15 -- accel/accel.sh@41 -- # local IFS=, 00:06:29.994 15:22:15 -- accel/accel.sh@42 -- # jq -r . 00:06:29.994 [2024-12-02 15:22:15.453624] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:29.994 [2024-12-02 15:22:15.453714] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1593360 ] 00:06:29.994 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.994 [2024-12-02 15:22:15.524783] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:29.994 [2024-12-02 15:22:15.592109] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.368 15:22:16 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:31.368 00:06:31.368 SPDK Configuration: 00:06:31.368 Core mask: 0x1 00:06:31.368 00:06:31.368 Accel Perf Configuration: 00:06:31.368 Workload Type: decompress 00:06:31.368 Transfer size: 111250 bytes 00:06:31.368 Vector count 1 00:06:31.368 Module: software 00:06:31.368 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:31.368 Queue depth: 32 00:06:31.368 Allocate depth: 32 00:06:31.368 # threads/core: 2 00:06:31.368 Run time: 1 seconds 00:06:31.368 Verify: Yes 00:06:31.368 00:06:31.368 Running for 1 seconds... 
00:06:31.368 00:06:31.368 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:31.368 ------------------------------------------------------------------------------------ 00:06:31.368 0,1 2944/s 121 MiB/s 0 0 00:06:31.368 0,0 2912/s 120 MiB/s 0 0 00:06:31.368 ==================================================================================== 00:06:31.368 Total 5856/s 621 MiB/s 0 0' 00:06:31.368 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.368 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.368 15:22:16 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:31.368 15:22:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:31.368 15:22:16 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.368 15:22:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.368 15:22:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.368 15:22:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.368 15:22:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.369 15:22:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.369 15:22:16 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.369 15:22:16 -- accel/accel.sh@42 -- # jq -r . 00:06:31.369 [2024-12-02 15:22:16.804360] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:31.369 [2024-12-02 15:22:16.804437] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1593628 ] 00:06:31.369 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.369 [2024-12-02 15:22:16.872238] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.369 [2024-12-02 15:22:16.937875] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val= 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val= 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val= 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val=0x1 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val= 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val= 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val=decompress 
00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val= 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val=software 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@23 -- # accel_module=software 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val=32 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val=32 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val=2 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val=Yes 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val= 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:31.369 15:22:16 -- accel/accel.sh@21 -- # val= 00:06:31.369 15:22:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # IFS=: 00:06:31.369 15:22:16 -- accel/accel.sh@20 -- # read -r var val 00:06:32.745 15:22:18 -- accel/accel.sh@21 -- # val= 00:06:32.745 15:22:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.745 15:22:18 -- accel/accel.sh@20 -- # IFS=: 00:06:32.745 15:22:18 -- accel/accel.sh@20 -- # read -r var val 00:06:32.745 15:22:18 -- accel/accel.sh@21 -- # val= 00:06:32.745 15:22:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.745 15:22:18 -- accel/accel.sh@20 -- # IFS=: 00:06:32.745 15:22:18 -- accel/accel.sh@20 -- # read -r var val 00:06:32.745 15:22:18 -- accel/accel.sh@21 -- # val= 00:06:32.745 15:22:18 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:32.745 15:22:18 -- accel/accel.sh@20 -- # IFS=: 00:06:32.745 15:22:18 -- accel/accel.sh@20 -- # read -r var val 00:06:32.745 15:22:18 -- accel/accel.sh@21 -- # val= 00:06:32.745 15:22:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.745 15:22:18 -- accel/accel.sh@20 -- # IFS=: 00:06:32.745 15:22:18 -- accel/accel.sh@20 -- # read -r var val 00:06:32.745 15:22:18 -- accel/accel.sh@21 -- # val= 00:06:32.745 15:22:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.745 15:22:18 -- accel/accel.sh@20 -- # IFS=: 00:06:32.745 15:22:18 -- accel/accel.sh@20 -- # read -r var val 00:06:32.745 15:22:18 -- accel/accel.sh@21 -- # val= 00:06:32.745 15:22:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.745 15:22:18 -- accel/accel.sh@20 -- # IFS=: 00:06:32.745 15:22:18 -- accel/accel.sh@20 -- # read -r var val 00:06:32.745 15:22:18 -- accel/accel.sh@21 -- # val= 00:06:32.745 15:22:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:32.745 15:22:18 -- accel/accel.sh@20 -- # IFS=: 00:06:32.745 15:22:18 -- accel/accel.sh@20 -- # read -r var val 00:06:32.745 15:22:18 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:32.745 15:22:18 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:32.745 15:22:18 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:32.745 00:06:32.745 real 0m2.705s 00:06:32.745 user 0m2.455s 00:06:32.745 sys 0m0.258s 00:06:32.745 15:22:18 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:32.745 15:22:18 -- common/autotest_common.sh@10 -- # set +x 00:06:32.745 ************************************ 00:06:32.745 END TEST accel_deomp_full_mthread 00:06:32.745 ************************************ 00:06:32.745 15:22:18 -- accel/accel.sh@116 -- # [[ n == y ]] 00:06:32.745 15:22:18 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:06:32.745 15:22:18 -- accel/accel.sh@129 -- # build_accel_config 00:06:32.745 15:22:18 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:06:32.745 15:22:18 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:32.745 15:22:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:32.745 15:22:18 -- common/autotest_common.sh@10 -- # set +x 00:06:32.745 15:22:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:32.745 15:22:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:32.745 15:22:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:32.745 15:22:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:32.745 15:22:18 -- accel/accel.sh@41 -- # local IFS=, 00:06:32.745 15:22:18 -- accel/accel.sh@42 -- # jq -r . 00:06:32.745 ************************************ 00:06:32.745 START TEST accel_dif_functional_tests 00:06:32.745 ************************************ 00:06:32.745 15:22:18 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:06:32.745 [2024-12-02 15:22:18.208729] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
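
Here the suite switches from throughput runs to CUnit functional tests of DIF (Data Integrity Field) handling. Each protected block carries three tags: a Guard (a CRC computed over the block data), an Application Tag, and a Reference Tag. The negative tests below corrupt each tag in turn to prove that verification catches the mismatch, so the *ERROR* lines that follow are expected output for passing tests. The test binary is driven like accel_perf, with a JSON config on a file descriptor:

# as invoked by the harness above; standalone, point -c at a JSON config
# file instead of the /dev/fd/62 descriptor the wrapper supplies
./test/accel/dif/dif -c /dev/fd/62
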
00:06:32.745 [2024-12-02 15:22:18.208805] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1593910 ]
00:06:32.745 EAL: No free 2048 kB hugepages reported on node 1
00:06:32.745 [2024-12-02 15:22:18.275945] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3
00:06:32.745 [2024-12-02 15:22:18.346601] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:06:32.745 [2024-12-02 15:22:18.346698] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:06:32.746 [2024-12-02 15:22:18.346698] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:32.746
00:06:32.746
00:06:32.746 CUnit - A unit testing framework for C - Version 2.1-3
00:06:32.746 http://cunit.sourceforge.net/
00:06:32.746
00:06:32.746
00:06:32.746 Suite: accel_dif
00:06:32.746 Test: verify: DIF generated, GUARD check ...passed
00:06:32.746 Test: verify: DIF generated, APPTAG check ...passed
00:06:32.746 Test: verify: DIF generated, REFTAG check ...passed
00:06:32.746 Test: verify: DIF not generated, GUARD check ...[2024-12-02 15:22:18.415824] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867
00:06:32.746 [2024-12-02 15:22:18.415874] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867
00:06:32.746 passed
00:06:32.746 Test: verify: DIF not generated, APPTAG check ...[2024-12-02 15:22:18.415909] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a
00:06:32.746 [2024-12-02 15:22:18.415928] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a
00:06:32.746 passed
00:06:32.746 Test: verify: DIF not generated, REFTAG check ...[2024-12-02 15:22:18.415950] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a
00:06:32.746 [2024-12-02 15:22:18.415968] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a
00:06:32.746 passed
00:06:32.746 Test: verify: APPTAG correct, APPTAG check ...passed
00:06:32.746 Test: verify: APPTAG incorrect, APPTAG check ...[2024-12-02 15:22:18.416014] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14
00:06:32.746 passed
00:06:32.746 Test: verify: APPTAG incorrect, no APPTAG check ...passed
00:06:32.746 Test: verify: REFTAG incorrect, REFTAG ignore ...passed
00:06:32.746 Test: verify: REFTAG_INIT correct, REFTAG check ...passed
00:06:32.746 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-12-02 15:22:18.416122] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10
00:06:32.746 passed
00:06:32.746 Test: generate copy: DIF generated, GUARD check ...passed
00:06:32.746 Test: generate copy: DIF generated, APTTAG check ...passed
00:06:32.746 Test: generate copy: DIF generated, REFTAG check ...passed
00:06:32.746 Test: generate copy: DIF generated, no GUARD check flag set ...passed
00:06:32.746 Test: generate copy: DIF generated, no APPTAG check flag set ...passed
00:06:32.746 Test: generate copy: DIF generated, no REFTAG check flag set ...passed
00:06:32.746 Test: generate copy: iovecs-len validate ...[2024-12-02 15:22:18.416307] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size.
00:06:32.746 passed 00:06:32.746 Test: generate copy: buffer alignment validate ...passed 00:06:32.746 00:06:32.746 Run Summary: Type Total Ran Passed Failed Inactive 00:06:32.746 suites 1 1 n/a 0 0 00:06:32.746 tests 20 20 20 0 0 00:06:32.746 asserts 204 204 204 0 n/a 00:06:32.746 00:06:32.746 Elapsed time = 0.002 seconds 00:06:33.005 00:06:33.005 real 0m0.392s 00:06:33.005 user 0m0.593s 00:06:33.005 sys 0m0.154s 00:06:33.005 15:22:18 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:33.005 15:22:18 -- common/autotest_common.sh@10 -- # set +x 00:06:33.005 ************************************ 00:06:33.005 END TEST accel_dif_functional_tests 00:06:33.005 ************************************ 00:06:33.005 00:06:33.005 real 0m57.259s 00:06:33.005 user 1m4.850s 00:06:33.005 sys 0m7.147s 00:06:33.005 15:22:18 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:33.005 15:22:18 -- common/autotest_common.sh@10 -- # set +x 00:06:33.005 ************************************ 00:06:33.005 END TEST accel 00:06:33.005 ************************************ 00:06:33.005 15:22:18 -- spdk/autotest.sh@177 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:06:33.005 15:22:18 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:33.005 15:22:18 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:33.005 15:22:18 -- common/autotest_common.sh@10 -- # set +x 00:06:33.005 ************************************ 00:06:33.005 START TEST accel_rpc 00:06:33.005 ************************************ 00:06:33.005 15:22:18 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:06:33.005 * Looking for test storage... 00:06:33.005 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:33.005 15:22:18 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:33.005 15:22:18 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:33.005 15:22:18 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:33.005 15:22:18 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:33.005 15:22:18 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:33.005 15:22:18 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:33.005 15:22:18 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:33.005 15:22:18 -- scripts/common.sh@335 -- # IFS=.-: 00:06:33.005 15:22:18 -- scripts/common.sh@335 -- # read -ra ver1 00:06:33.005 15:22:18 -- scripts/common.sh@336 -- # IFS=.-: 00:06:33.005 15:22:18 -- scripts/common.sh@336 -- # read -ra ver2 00:06:33.005 15:22:18 -- scripts/common.sh@337 -- # local 'op=<' 00:06:33.005 15:22:18 -- scripts/common.sh@339 -- # ver1_l=2 00:06:33.005 15:22:18 -- scripts/common.sh@340 -- # ver2_l=1 00:06:33.005 15:22:18 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:33.005 15:22:18 -- scripts/common.sh@343 -- # case "$op" in 00:06:33.005 15:22:18 -- scripts/common.sh@344 -- # : 1 00:06:33.005 15:22:18 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:33.005 15:22:18 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:33.264 15:22:18 -- scripts/common.sh@364 -- # decimal 1 00:06:33.264 15:22:18 -- scripts/common.sh@352 -- # local d=1 00:06:33.264 15:22:18 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:33.264 15:22:18 -- scripts/common.sh@354 -- # echo 1 00:06:33.264 15:22:18 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:33.264 15:22:18 -- scripts/common.sh@365 -- # decimal 2 00:06:33.264 15:22:18 -- scripts/common.sh@352 -- # local d=2 00:06:33.264 15:22:18 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:33.264 15:22:18 -- scripts/common.sh@354 -- # echo 2 00:06:33.264 15:22:18 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:33.264 15:22:18 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:33.264 15:22:18 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:33.264 15:22:18 -- scripts/common.sh@367 -- # return 0 00:06:33.264 15:22:18 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:33.264 15:22:18 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:33.264 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:33.264 --rc genhtml_branch_coverage=1 00:06:33.264 --rc genhtml_function_coverage=1 00:06:33.264 --rc genhtml_legend=1 00:06:33.264 --rc geninfo_all_blocks=1 00:06:33.264 --rc geninfo_unexecuted_blocks=1 00:06:33.264 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:33.264 ' 00:06:33.264 15:22:18 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:33.264 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:33.264 --rc genhtml_branch_coverage=1 00:06:33.264 --rc genhtml_function_coverage=1 00:06:33.264 --rc genhtml_legend=1 00:06:33.264 --rc geninfo_all_blocks=1 00:06:33.264 --rc geninfo_unexecuted_blocks=1 00:06:33.264 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:33.264 ' 00:06:33.264 15:22:18 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:33.264 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:33.264 --rc genhtml_branch_coverage=1 00:06:33.264 --rc genhtml_function_coverage=1 00:06:33.264 --rc genhtml_legend=1 00:06:33.264 --rc geninfo_all_blocks=1 00:06:33.264 --rc geninfo_unexecuted_blocks=1 00:06:33.264 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:33.264 ' 00:06:33.264 15:22:18 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:33.264 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:33.264 --rc genhtml_branch_coverage=1 00:06:33.264 --rc genhtml_function_coverage=1 00:06:33.264 --rc genhtml_legend=1 00:06:33.264 --rc geninfo_all_blocks=1 00:06:33.264 --rc geninfo_unexecuted_blocks=1 00:06:33.264 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:33.264 ' 00:06:33.264 15:22:18 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:33.264 15:22:18 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1593996 00:06:33.264 15:22:18 -- accel/accel_rpc.sh@15 -- # waitforlisten 1593996 00:06:33.264 15:22:18 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:06:33.264 15:22:18 -- common/autotest_common.sh@829 -- # '[' -z 1593996 ']' 00:06:33.264 15:22:18 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:33.264 15:22:18 -- common/autotest_common.sh@834 -- # local max_retries=100 
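Every suite in this log opens with the same coverage guard, traced above: scripts/common.sh feeds `lcov --version` through awk and tests whether it is older than 2; only on lcov 1.x does it export the LCOV_OPTS/LCOV strings with the --rc flags and the repo's llvm-gcov.sh as --gcov-tool. A simplified sketch of that version test, reconstructed from the trace rather than copied from scripts/common.sh:

lt() {                                    # true (exit 0) when $1 is an older version than $2
  local -a ver1 ver2
  local v
  IFS=.-: read -ra ver1 <<< "$1"          # split on '.', '-' and ':', as the trace shows
  IFS=.-: read -ra ver2 <<< "$2"
  for ((v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++)); do
    (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1
    (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0
  done
  return 1                                # equal versions are not "less than"
}

if lt "$(lcov --version | awk '{print $NF}')" 2; then
  # lcov 1.x: enable branch/function coverage, plus the genhtml/geninfo
  # options and the --gcov-tool wrapper shown in the exported strings above,
  # presumably because lcov 2.x renamed these --rc settings.
  export LCOV_OPTS='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
fi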
00:06:33.264 15:22:18 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:33.264 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:33.264 15:22:18 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:33.264 15:22:18 -- common/autotest_common.sh@10 -- # set +x 00:06:33.264 [2024-12-02 15:22:18.864264] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:33.264 [2024-12-02 15:22:18.864359] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1593996 ] 00:06:33.264 EAL: No free 2048 kB hugepages reported on node 1 00:06:33.264 [2024-12-02 15:22:18.934175] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.264 [2024-12-02 15:22:19.007787] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:33.264 [2024-12-02 15:22:19.007908] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.200 15:22:19 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:34.200 15:22:19 -- common/autotest_common.sh@862 -- # return 0 00:06:34.200 15:22:19 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:06:34.200 15:22:19 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:06:34.200 15:22:19 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:06:34.200 15:22:19 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:06:34.200 15:22:19 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:06:34.200 15:22:19 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:34.200 15:22:19 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:34.200 15:22:19 -- common/autotest_common.sh@10 -- # set +x 00:06:34.200 ************************************ 00:06:34.200 START TEST accel_assign_opcode 00:06:34.200 ************************************ 00:06:34.200 15:22:19 -- common/autotest_common.sh@1114 -- # accel_assign_opcode_test_suite 00:06:34.200 15:22:19 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:06:34.200 15:22:19 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:34.200 15:22:19 -- common/autotest_common.sh@10 -- # set +x 00:06:34.200 [2024-12-02 15:22:19.705968] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:06:34.200 15:22:19 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:34.200 15:22:19 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:06:34.200 15:22:19 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:34.200 15:22:19 -- common/autotest_common.sh@10 -- # set +x 00:06:34.200 [2024-12-02 15:22:19.713983] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:06:34.200 15:22:19 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:34.200 15:22:19 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:06:34.200 15:22:19 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:34.200 15:22:19 -- common/autotest_common.sh@10 -- # set +x 00:06:34.200 15:22:19 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:34.200 15:22:19 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:06:34.200 15:22:19 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:06:34.200 15:22:19 -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:06:34.200 15:22:19 -- common/autotest_common.sh@10 -- # set +x 00:06:34.200 15:22:19 -- accel/accel_rpc.sh@42 -- # grep software 00:06:34.200 15:22:19 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:34.200 software 00:06:34.200 00:06:34.200 real 0m0.224s 00:06:34.200 user 0m0.038s 00:06:34.200 sys 0m0.010s 00:06:34.200 15:22:19 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:34.200 15:22:19 -- common/autotest_common.sh@10 -- # set +x 00:06:34.200 ************************************ 00:06:34.200 END TEST accel_assign_opcode 00:06:34.200 ************************************ 00:06:34.200 15:22:19 -- accel/accel_rpc.sh@55 -- # killprocess 1593996 00:06:34.200 15:22:19 -- common/autotest_common.sh@936 -- # '[' -z 1593996 ']' 00:06:34.200 15:22:19 -- common/autotest_common.sh@940 -- # kill -0 1593996 00:06:34.200 15:22:19 -- common/autotest_common.sh@941 -- # uname 00:06:34.200 15:22:19 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:34.200 15:22:19 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1593996 00:06:34.458 15:22:20 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:34.458 15:22:20 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:34.458 15:22:20 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1593996' 00:06:34.458 killing process with pid 1593996 00:06:34.458 15:22:20 -- common/autotest_common.sh@955 -- # kill 1593996 00:06:34.458 15:22:20 -- common/autotest_common.sh@960 -- # wait 1593996 00:06:34.720 00:06:34.720 real 0m1.677s 00:06:34.720 user 0m1.703s 00:06:34.720 sys 0m0.493s 00:06:34.720 15:22:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:34.720 15:22:20 -- common/autotest_common.sh@10 -- # set +x 00:06:34.720 ************************************ 00:06:34.720 END TEST accel_rpc 00:06:34.720 ************************************ 00:06:34.720 15:22:20 -- spdk/autotest.sh@178 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:06:34.720 15:22:20 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:34.720 15:22:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:34.720 15:22:20 -- common/autotest_common.sh@10 -- # set +x 00:06:34.720 ************************************ 00:06:34.720 START TEST app_cmdline 00:06:34.720 ************************************ 00:06:34.720 15:22:20 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:06:34.720 * Looking for test storage... 
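The accel_rpc suite that just passed reduces to a short RPC conversation with spdk_tgt: start the target with --wait-for-rpc, assign the copy opcode to a module while the framework is still uninitialized, finish init, then read the assignment back. A minimal standalone sketch of that flow, assuming $SPDK_DIR points at a built SPDK tree (the log uses the Jenkins workspace checkout) and using a crude sleep where the harness polls with waitforlisten:

rpc="$SPDK_DIR/scripts/rpc.py"
"$SPDK_DIR/build/bin/spdk_tgt" --wait-for-rpc &   # hold framework init so opcodes can still be reassigned
tgt_pid=$!
sleep 2                                           # crude; the harness waits on /var/tmp/spdk.sock instead

$rpc accel_assign_opc -o copy -m software         # only accepted before framework_start_init
$rpc framework_start_init                         # assignments are fixed from here on
$rpc accel_get_opc_assignments | jq -r .copy      # prints: software

kill "$tgt_pid"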
00:06:34.720 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:34.720 15:22:20 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:34.720 15:22:20 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:34.720 15:22:20 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:35.008 15:22:20 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:35.008 15:22:20 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:35.008 15:22:20 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:35.008 15:22:20 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:35.008 15:22:20 -- scripts/common.sh@335 -- # IFS=.-: 00:06:35.008 15:22:20 -- scripts/common.sh@335 -- # read -ra ver1 00:06:35.008 15:22:20 -- scripts/common.sh@336 -- # IFS=.-: 00:06:35.008 15:22:20 -- scripts/common.sh@336 -- # read -ra ver2 00:06:35.008 15:22:20 -- scripts/common.sh@337 -- # local 'op=<' 00:06:35.008 15:22:20 -- scripts/common.sh@339 -- # ver1_l=2 00:06:35.008 15:22:20 -- scripts/common.sh@340 -- # ver2_l=1 00:06:35.008 15:22:20 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:35.008 15:22:20 -- scripts/common.sh@343 -- # case "$op" in 00:06:35.008 15:22:20 -- scripts/common.sh@344 -- # : 1 00:06:35.008 15:22:20 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:35.008 15:22:20 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:35.008 15:22:20 -- scripts/common.sh@364 -- # decimal 1 00:06:35.008 15:22:20 -- scripts/common.sh@352 -- # local d=1 00:06:35.008 15:22:20 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:35.008 15:22:20 -- scripts/common.sh@354 -- # echo 1 00:06:35.008 15:22:20 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:35.008 15:22:20 -- scripts/common.sh@365 -- # decimal 2 00:06:35.008 15:22:20 -- scripts/common.sh@352 -- # local d=2 00:06:35.008 15:22:20 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:35.008 15:22:20 -- scripts/common.sh@354 -- # echo 2 00:06:35.008 15:22:20 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:35.008 15:22:20 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:35.008 15:22:20 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:35.008 15:22:20 -- scripts/common.sh@367 -- # return 0 00:06:35.008 15:22:20 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:35.008 15:22:20 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:35.008 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:35.008 --rc genhtml_branch_coverage=1 00:06:35.008 --rc genhtml_function_coverage=1 00:06:35.008 --rc genhtml_legend=1 00:06:35.008 --rc geninfo_all_blocks=1 00:06:35.008 --rc geninfo_unexecuted_blocks=1 00:06:35.008 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:35.008 ' 00:06:35.008 15:22:20 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:35.008 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:35.008 --rc genhtml_branch_coverage=1 00:06:35.008 --rc genhtml_function_coverage=1 00:06:35.008 --rc genhtml_legend=1 00:06:35.008 --rc geninfo_all_blocks=1 00:06:35.008 --rc geninfo_unexecuted_blocks=1 00:06:35.008 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:35.008 ' 00:06:35.008 15:22:20 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:35.008 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:35.008 --rc genhtml_branch_coverage=1 00:06:35.008 
--rc genhtml_function_coverage=1 00:06:35.008 --rc genhtml_legend=1 00:06:35.008 --rc geninfo_all_blocks=1 00:06:35.008 --rc geninfo_unexecuted_blocks=1 00:06:35.008 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:35.008 ' 00:06:35.008 15:22:20 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:35.008 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:35.008 --rc genhtml_branch_coverage=1 00:06:35.008 --rc genhtml_function_coverage=1 00:06:35.008 --rc genhtml_legend=1 00:06:35.008 --rc geninfo_all_blocks=1 00:06:35.008 --rc geninfo_unexecuted_blocks=1 00:06:35.008 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:35.008 ' 00:06:35.008 15:22:20 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:06:35.008 15:22:20 -- app/cmdline.sh@17 -- # spdk_tgt_pid=1594420 00:06:35.008 15:22:20 -- app/cmdline.sh@18 -- # waitforlisten 1594420 00:06:35.008 15:22:20 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:06:35.008 15:22:20 -- common/autotest_common.sh@829 -- # '[' -z 1594420 ']' 00:06:35.008 15:22:20 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:35.008 15:22:20 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:35.008 15:22:20 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:35.008 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:35.008 15:22:20 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:35.008 15:22:20 -- common/autotest_common.sh@10 -- # set +x 00:06:35.008 [2024-12-02 15:22:20.604612] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:35.008 [2024-12-02 15:22:20.604704] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1594420 ] 00:06:35.008 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.008 [2024-12-02 15:22:20.673864] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.008 [2024-12-02 15:22:20.747771] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:35.008 [2024-12-02 15:22:20.747873] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:35.614 15:22:21 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:35.614 15:22:21 -- common/autotest_common.sh@862 -- # return 0 00:06:35.615 15:22:21 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:06:35.872 { 00:06:35.872 "version": "SPDK v24.01.1-pre git sha1 c13c99a5e", 00:06:35.872 "fields": { 00:06:35.872 "major": 24, 00:06:35.872 "minor": 1, 00:06:35.872 "patch": 1, 00:06:35.872 "suffix": "-pre", 00:06:35.872 "commit": "c13c99a5e" 00:06:35.872 } 00:06:35.872 } 00:06:35.872 15:22:21 -- app/cmdline.sh@22 -- # expected_methods=() 00:06:35.872 15:22:21 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:06:35.872 15:22:21 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:06:35.872 15:22:21 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:06:35.872 15:22:21 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:06:35.872 15:22:21 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:06:35.872 15:22:21 -- app/cmdline.sh@26 -- # sort 00:06:35.872 15:22:21 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:35.872 15:22:21 -- common/autotest_common.sh@10 -- # set +x 00:06:35.872 15:22:21 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:35.872 15:22:21 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:06:35.872 15:22:21 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:06:35.872 15:22:21 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:35.872 15:22:21 -- common/autotest_common.sh@650 -- # local es=0 00:06:35.872 15:22:21 -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:35.872 15:22:21 -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:35.872 15:22:21 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:35.872 15:22:21 -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:35.872 15:22:21 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:35.872 15:22:21 -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:35.873 15:22:21 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:35.873 15:22:21 -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:35.873 15:22:21 -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:06:35.873 15:22:21 -- 
common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:36.130 request: 00:06:36.131 { 00:06:36.131 "method": "env_dpdk_get_mem_stats", 00:06:36.131 "req_id": 1 00:06:36.131 } 00:06:36.131 Got JSON-RPC error response 00:06:36.131 response: 00:06:36.131 { 00:06:36.131 "code": -32601, 00:06:36.131 "message": "Method not found" 00:06:36.131 } 00:06:36.131 15:22:21 -- common/autotest_common.sh@653 -- # es=1 00:06:36.131 15:22:21 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:36.131 15:22:21 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:36.131 15:22:21 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:36.131 15:22:21 -- app/cmdline.sh@1 -- # killprocess 1594420 00:06:36.131 15:22:21 -- common/autotest_common.sh@936 -- # '[' -z 1594420 ']' 00:06:36.131 15:22:21 -- common/autotest_common.sh@940 -- # kill -0 1594420 00:06:36.131 15:22:21 -- common/autotest_common.sh@941 -- # uname 00:06:36.131 15:22:21 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:36.131 15:22:21 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 1594420 00:06:36.131 15:22:21 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:36.131 15:22:21 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:36.131 15:22:21 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 1594420' 00:06:36.131 killing process with pid 1594420 00:06:36.131 15:22:21 -- common/autotest_common.sh@955 -- # kill 1594420 00:06:36.131 15:22:21 -- common/autotest_common.sh@960 -- # wait 1594420 00:06:36.388 00:06:36.388 real 0m1.804s 00:06:36.388 user 0m2.078s 00:06:36.388 sys 0m0.508s 00:06:36.388 15:22:22 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:36.388 15:22:22 -- common/autotest_common.sh@10 -- # set +x 00:06:36.389 ************************************ 00:06:36.389 END TEST app_cmdline 00:06:36.389 ************************************ 00:06:36.647 15:22:22 -- spdk/autotest.sh@179 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:06:36.647 15:22:22 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:36.647 15:22:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:36.647 15:22:22 -- common/autotest_common.sh@10 -- # set +x 00:06:36.647 ************************************ 00:06:36.647 START TEST version 00:06:36.647 ************************************ 00:06:36.647 15:22:22 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:06:36.647 * Looking for test storage... 
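The app_cmdline suite above exercises spdk_tgt's RPC allowlist: started with --rpcs-allowed, the target answers only the two listed methods and rejects everything else with JSON-RPC error -32601, which is exactly the "Method not found" response logged for env_dpdk_get_mem_stats. A condensed sketch of those checks, under the same $SPDK_DIR assumption:

rpc="$SPDK_DIR/scripts/rpc.py"
"$SPDK_DIR/build/bin/spdk_tgt" --rpcs-allowed spdk_get_version,rpc_get_methods &
sleep 2

$rpc spdk_get_version                      # allowed: returns the version object with major/minor/patch/suffix/commit
$rpc rpc_get_methods | jq -r '.[]' | sort  # exactly the two allowlisted methods
$rpc env_dpdk_get_mem_stats                # blocked: error -32601 "Method not found", as logged above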
00:06:36.647 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:36.647 15:22:22 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:36.647 15:22:22 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:36.647 15:22:22 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:36.647 15:22:22 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:36.647 15:22:22 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:36.647 15:22:22 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:36.647 15:22:22 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:36.647 15:22:22 -- scripts/common.sh@335 -- # IFS=.-: 00:06:36.647 15:22:22 -- scripts/common.sh@335 -- # read -ra ver1 00:06:36.647 15:22:22 -- scripts/common.sh@336 -- # IFS=.-: 00:06:36.647 15:22:22 -- scripts/common.sh@336 -- # read -ra ver2 00:06:36.647 15:22:22 -- scripts/common.sh@337 -- # local 'op=<' 00:06:36.647 15:22:22 -- scripts/common.sh@339 -- # ver1_l=2 00:06:36.647 15:22:22 -- scripts/common.sh@340 -- # ver2_l=1 00:06:36.647 15:22:22 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:36.647 15:22:22 -- scripts/common.sh@343 -- # case "$op" in 00:06:36.647 15:22:22 -- scripts/common.sh@344 -- # : 1 00:06:36.647 15:22:22 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:36.647 15:22:22 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:36.647 15:22:22 -- scripts/common.sh@364 -- # decimal 1 00:06:36.647 15:22:22 -- scripts/common.sh@352 -- # local d=1 00:06:36.647 15:22:22 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:36.647 15:22:22 -- scripts/common.sh@354 -- # echo 1 00:06:36.647 15:22:22 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:36.647 15:22:22 -- scripts/common.sh@365 -- # decimal 2 00:06:36.647 15:22:22 -- scripts/common.sh@352 -- # local d=2 00:06:36.647 15:22:22 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:36.647 15:22:22 -- scripts/common.sh@354 -- # echo 2 00:06:36.647 15:22:22 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:36.647 15:22:22 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:36.647 15:22:22 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:36.647 15:22:22 -- scripts/common.sh@367 -- # return 0 00:06:36.647 15:22:22 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:36.647 15:22:22 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:36.647 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:36.647 --rc genhtml_branch_coverage=1 00:06:36.647 --rc genhtml_function_coverage=1 00:06:36.647 --rc genhtml_legend=1 00:06:36.647 --rc geninfo_all_blocks=1 00:06:36.647 --rc geninfo_unexecuted_blocks=1 00:06:36.647 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:36.647 ' 00:06:36.647 15:22:22 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:36.647 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:36.647 --rc genhtml_branch_coverage=1 00:06:36.647 --rc genhtml_function_coverage=1 00:06:36.647 --rc genhtml_legend=1 00:06:36.647 --rc geninfo_all_blocks=1 00:06:36.647 --rc geninfo_unexecuted_blocks=1 00:06:36.647 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:36.647 ' 00:06:36.647 15:22:22 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:36.647 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:36.647 --rc genhtml_branch_coverage=1 00:06:36.647 
--rc genhtml_function_coverage=1 00:06:36.647 --rc genhtml_legend=1 00:06:36.647 --rc geninfo_all_blocks=1 00:06:36.647 --rc geninfo_unexecuted_blocks=1 00:06:36.647 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:36.647 ' 00:06:36.647 15:22:22 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:36.647 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:36.647 --rc genhtml_branch_coverage=1 00:06:36.647 --rc genhtml_function_coverage=1 00:06:36.647 --rc genhtml_legend=1 00:06:36.647 --rc geninfo_all_blocks=1 00:06:36.647 --rc geninfo_unexecuted_blocks=1 00:06:36.647 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:36.647 ' 00:06:36.647 15:22:22 -- app/version.sh@17 -- # get_header_version major 00:06:36.648 15:22:22 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:36.648 15:22:22 -- app/version.sh@14 -- # cut -f2 00:06:36.648 15:22:22 -- app/version.sh@14 -- # tr -d '"' 00:06:36.648 15:22:22 -- app/version.sh@17 -- # major=24 00:06:36.648 15:22:22 -- app/version.sh@18 -- # get_header_version minor 00:06:36.648 15:22:22 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:36.648 15:22:22 -- app/version.sh@14 -- # cut -f2 00:06:36.648 15:22:22 -- app/version.sh@14 -- # tr -d '"' 00:06:36.648 15:22:22 -- app/version.sh@18 -- # minor=1 00:06:36.648 15:22:22 -- app/version.sh@19 -- # get_header_version patch 00:06:36.648 15:22:22 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:36.648 15:22:22 -- app/version.sh@14 -- # cut -f2 00:06:36.648 15:22:22 -- app/version.sh@14 -- # tr -d '"' 00:06:36.648 15:22:22 -- app/version.sh@19 -- # patch=1 00:06:36.648 15:22:22 -- app/version.sh@20 -- # get_header_version suffix 00:06:36.648 15:22:22 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:36.648 15:22:22 -- app/version.sh@14 -- # cut -f2 00:06:36.648 15:22:22 -- app/version.sh@14 -- # tr -d '"' 00:06:36.648 15:22:22 -- app/version.sh@20 -- # suffix=-pre 00:06:36.648 15:22:22 -- app/version.sh@22 -- # version=24.1 00:06:36.648 15:22:22 -- app/version.sh@25 -- # (( patch != 0 )) 00:06:36.648 15:22:22 -- app/version.sh@25 -- # version=24.1.1 00:06:36.648 15:22:22 -- app/version.sh@28 -- # version=24.1.1rc0 00:06:36.648 15:22:22 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:36.648 15:22:22 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:06:36.906 15:22:22 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:06:36.906 15:22:22 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:06:36.906 00:06:36.906 real 0m0.259s 00:06:36.906 user 0m0.145s 00:06:36.906 sys 0m0.168s 00:06:36.907 15:22:22 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:36.907 15:22:22 -- common/autotest_common.sh@10 -- # set +x 00:06:36.907 
************************************ 00:06:36.907 END TEST version 00:06:36.907 ************************************ 00:06:36.907 15:22:22 -- spdk/autotest.sh@181 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@191 -- # uname -s 00:06:36.907 15:22:22 -- spdk/autotest.sh@191 -- # [[ Linux == Linux ]] 00:06:36.907 15:22:22 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:06:36.907 15:22:22 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:06:36.907 15:22:22 -- spdk/autotest.sh@204 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@251 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@255 -- # timing_exit lib 00:06:36.907 15:22:22 -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:36.907 15:22:22 -- common/autotest_common.sh@10 -- # set +x 00:06:36.907 15:22:22 -- spdk/autotest.sh@257 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@265 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@298 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@302 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:06:36.907 15:22:22 -- spdk/autotest.sh@353 -- # [[ 0 -eq 1 ]] 00:06:36.907 15:22:22 -- spdk/autotest.sh@357 -- # [[ 0 -eq 1 ]] 00:06:36.907 15:22:22 -- spdk/autotest.sh@361 -- # [[ 1 -eq 1 ]] 00:06:36.907 15:22:22 -- spdk/autotest.sh@362 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:06:36.907 15:22:22 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:36.907 15:22:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:36.907 15:22:22 -- common/autotest_common.sh@10 -- # set +x 00:06:36.907 ************************************ 00:06:36.907 START TEST llvm_fuzz 00:06:36.907 ************************************ 00:06:36.907 15:22:22 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:06:36.907 * Looking for test storage... 
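version.sh, which just passed, derives the version string purely from the preprocessor defines in include/spdk/version.h and checks it against the installed Python bindings. A sketch of that extraction, mirroring the grep/cut/tr pipeline in the trace (cut -f2 relies on the header's tab-separated defines; the -pre to rc0 mapping follows the logged values rather than the script verbatim):

hdr="$SPDK_DIR/include/spdk/version.h"
get_ver() { grep -E "^#define SPDK_VERSION_$1[[:space:]]+" "$hdr" | cut -f2 | tr -d '"'; }

major=$(get_ver MAJOR)    # 24
minor=$(get_ver MINOR)    # 1
patch=$(get_ver PATCH)    # 1
suffix=$(get_ver SUFFIX)  # -pre

version="$major.$minor"
(( patch != 0 )) && version="$version.$patch"     # -> 24.1.1
[[ $suffix == -pre ]] && version="${version}rc0"  # -> 24.1.1rc0
[[ $(python3 -c 'import spdk; print(spdk.__version__)') == "$version" ]]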
00:06:36.907 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:06:36.907 15:22:22 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:36.907 15:22:22 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:36.907 15:22:22 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:37.167 15:22:22 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:37.167 15:22:22 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:37.167 15:22:22 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:37.167 15:22:22 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:37.167 15:22:22 -- scripts/common.sh@335 -- # IFS=.-: 00:06:37.167 15:22:22 -- scripts/common.sh@335 -- # read -ra ver1 00:06:37.167 15:22:22 -- scripts/common.sh@336 -- # IFS=.-: 00:06:37.167 15:22:22 -- scripts/common.sh@336 -- # read -ra ver2 00:06:37.167 15:22:22 -- scripts/common.sh@337 -- # local 'op=<' 00:06:37.167 15:22:22 -- scripts/common.sh@339 -- # ver1_l=2 00:06:37.167 15:22:22 -- scripts/common.sh@340 -- # ver2_l=1 00:06:37.167 15:22:22 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:37.167 15:22:22 -- scripts/common.sh@343 -- # case "$op" in 00:06:37.167 15:22:22 -- scripts/common.sh@344 -- # : 1 00:06:37.167 15:22:22 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:37.167 15:22:22 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:37.167 15:22:22 -- scripts/common.sh@364 -- # decimal 1 00:06:37.167 15:22:22 -- scripts/common.sh@352 -- # local d=1 00:06:37.167 15:22:22 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:37.167 15:22:22 -- scripts/common.sh@354 -- # echo 1 00:06:37.167 15:22:22 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:37.167 15:22:22 -- scripts/common.sh@365 -- # decimal 2 00:06:37.167 15:22:22 -- scripts/common.sh@352 -- # local d=2 00:06:37.167 15:22:22 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:37.167 15:22:22 -- scripts/common.sh@354 -- # echo 2 00:06:37.167 15:22:22 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:37.167 15:22:22 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:37.167 15:22:22 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:37.167 15:22:22 -- scripts/common.sh@367 -- # return 0 00:06:37.167 15:22:22 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:37.167 15:22:22 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:37.167 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:37.167 --rc genhtml_branch_coverage=1 00:06:37.167 --rc genhtml_function_coverage=1 00:06:37.167 --rc genhtml_legend=1 00:06:37.167 --rc geninfo_all_blocks=1 00:06:37.167 --rc geninfo_unexecuted_blocks=1 00:06:37.167 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:37.167 ' 00:06:37.167 15:22:22 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:37.167 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:37.167 --rc genhtml_branch_coverage=1 00:06:37.167 --rc genhtml_function_coverage=1 00:06:37.167 --rc genhtml_legend=1 00:06:37.167 --rc geninfo_all_blocks=1 00:06:37.167 --rc geninfo_unexecuted_blocks=1 00:06:37.167 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:37.167 ' 00:06:37.167 15:22:22 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:37.167 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:37.167 --rc genhtml_branch_coverage=1 00:06:37.167 
--rc genhtml_function_coverage=1 00:06:37.167 --rc genhtml_legend=1 00:06:37.167 --rc geninfo_all_blocks=1 00:06:37.167 --rc geninfo_unexecuted_blocks=1 00:06:37.167 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:37.167 ' 00:06:37.167 15:22:22 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:37.167 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:37.167 --rc genhtml_branch_coverage=1 00:06:37.167 --rc genhtml_function_coverage=1 00:06:37.167 --rc genhtml_legend=1 00:06:37.167 --rc geninfo_all_blocks=1 00:06:37.167 --rc geninfo_unexecuted_blocks=1 00:06:37.167 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:37.167 ' 00:06:37.167 15:22:22 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:06:37.167 15:22:22 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:06:37.167 15:22:22 -- common/autotest_common.sh@548 -- # fuzzers=() 00:06:37.167 15:22:22 -- common/autotest_common.sh@548 -- # local fuzzers 00:06:37.167 15:22:22 -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:06:37.167 15:22:22 -- common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:06:37.167 15:22:22 -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:06:37.167 15:22:22 -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:06:37.167 15:22:22 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:06:37.167 15:22:22 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:06:37.167 15:22:22 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:06:37.167 15:22:22 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:06:37.167 15:22:22 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:06:37.167 15:22:22 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:06:37.167 15:22:22 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:06:37.167 15:22:22 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:06:37.167 15:22:22 -- fuzz/llvm.sh@19 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:06:37.167 15:22:22 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:37.167 15:22:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:37.167 15:22:22 -- common/autotest_common.sh@10 -- # set +x 00:06:37.167 ************************************ 00:06:37.167 START TEST nvmf_fuzz 00:06:37.167 ************************************ 00:06:37.167 15:22:22 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:06:37.167 * Looking for test storage... 
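llvm.sh, whose setup trace appears above, discovers fuzzer targets by globbing test/fuzz/llvm/, stripping the directory prefix, and dispatching the entries it recognizes; helper files such as common.sh and llvm-gcov.sh fall through the case statement. A sketch of that loop as the trace suggests (only the nvmf arm is actually seen firing here; treating vfio the same way is an inference from the listing):

fuzzers=("$rootdir"/test/fuzz/llvm/*)     # per the trace: common.sh llvm-gcov.sh nvmf vfio
fuzzers=("${fuzzers[@]##*/}")             # keep only the basenames
mkdir -p "$rootdir/../corpus/" "$llvm_out/coverage"   # $llvm_out is set in the trace above

for fuzzer in "${fuzzers[@]}"; do
  case "$fuzzer" in
    nvmf | vfio)                          # plain helper scripts fall through
      run_test "${fuzzer}_fuzz" "$rootdir/test/fuzz/llvm/$fuzzer/run.sh"
      ;;
  esac
done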
00:06:37.167 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:37.167 15:22:22 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:37.167 15:22:22 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:37.167 15:22:22 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:37.167 15:22:22 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:37.167 15:22:22 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:37.167 15:22:22 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:37.167 15:22:22 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:37.167 15:22:22 -- scripts/common.sh@335 -- # IFS=.-: 00:06:37.167 15:22:22 -- scripts/common.sh@335 -- # read -ra ver1 00:06:37.167 15:22:22 -- scripts/common.sh@336 -- # IFS=.-: 00:06:37.167 15:22:22 -- scripts/common.sh@336 -- # read -ra ver2 00:06:37.167 15:22:22 -- scripts/common.sh@337 -- # local 'op=<' 00:06:37.167 15:22:22 -- scripts/common.sh@339 -- # ver1_l=2 00:06:37.167 15:22:22 -- scripts/common.sh@340 -- # ver2_l=1 00:06:37.167 15:22:22 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:37.167 15:22:22 -- scripts/common.sh@343 -- # case "$op" in 00:06:37.167 15:22:22 -- scripts/common.sh@344 -- # : 1 00:06:37.168 15:22:22 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:37.168 15:22:22 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:37.168 15:22:22 -- scripts/common.sh@364 -- # decimal 1 00:06:37.168 15:22:22 -- scripts/common.sh@352 -- # local d=1 00:06:37.168 15:22:22 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:37.168 15:22:22 -- scripts/common.sh@354 -- # echo 1 00:06:37.168 15:22:22 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:37.168 15:22:22 -- scripts/common.sh@365 -- # decimal 2 00:06:37.168 15:22:22 -- scripts/common.sh@352 -- # local d=2 00:06:37.168 15:22:22 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:37.168 15:22:22 -- scripts/common.sh@354 -- # echo 2 00:06:37.168 15:22:22 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:37.168 15:22:22 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:37.168 15:22:22 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:37.168 15:22:22 -- scripts/common.sh@367 -- # return 0 00:06:37.168 15:22:22 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:37.168 15:22:22 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:37.168 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:37.168 --rc genhtml_branch_coverage=1 00:06:37.168 --rc genhtml_function_coverage=1 00:06:37.168 --rc genhtml_legend=1 00:06:37.168 --rc geninfo_all_blocks=1 00:06:37.168 --rc geninfo_unexecuted_blocks=1 00:06:37.168 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:37.168 ' 00:06:37.168 15:22:22 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:37.168 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:37.168 --rc genhtml_branch_coverage=1 00:06:37.168 --rc genhtml_function_coverage=1 00:06:37.168 --rc genhtml_legend=1 00:06:37.168 --rc geninfo_all_blocks=1 00:06:37.168 --rc geninfo_unexecuted_blocks=1 00:06:37.168 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:37.168 ' 00:06:37.168 15:22:22 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:37.168 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:37.168 --rc genhtml_branch_coverage=1 
00:06:37.168 --rc genhtml_function_coverage=1 00:06:37.168 --rc genhtml_legend=1 00:06:37.168 --rc geninfo_all_blocks=1 00:06:37.168 --rc geninfo_unexecuted_blocks=1 00:06:37.168 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:37.168 ' 00:06:37.168 15:22:22 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:37.168 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:37.168 --rc genhtml_branch_coverage=1 00:06:37.168 --rc genhtml_function_coverage=1 00:06:37.168 --rc genhtml_legend=1 00:06:37.168 --rc geninfo_all_blocks=1 00:06:37.168 --rc geninfo_unexecuted_blocks=1 00:06:37.168 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:37.168 ' 00:06:37.168 15:22:22 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:06:37.168 15:22:22 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:06:37.168 15:22:22 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:06:37.168 15:22:22 -- common/autotest_common.sh@34 -- # set -e 00:06:37.168 15:22:22 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:06:37.168 15:22:22 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:06:37.168 15:22:22 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:06:37.168 15:22:22 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:06:37.168 15:22:22 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:06:37.168 15:22:22 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:06:37.168 15:22:22 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:06:37.168 15:22:22 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:06:37.168 15:22:22 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:06:37.168 15:22:22 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:06:37.168 15:22:22 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:06:37.168 15:22:22 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:06:37.168 15:22:22 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:06:37.168 15:22:22 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:06:37.168 15:22:22 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:06:37.168 15:22:22 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:06:37.168 15:22:22 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:06:37.168 15:22:22 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:06:37.168 15:22:22 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:06:37.168 15:22:22 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:06:37.168 15:22:22 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:06:37.168 15:22:22 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:06:37.168 15:22:22 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:06:37.168 15:22:22 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:06:37.168 15:22:22 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:06:37.431 15:22:22 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:06:37.431 15:22:22 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:06:37.431 15:22:22 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:06:37.431 15:22:22 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:06:37.431 
15:22:22 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:06:37.431 15:22:22 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:06:37.431 15:22:22 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:06:37.431 15:22:22 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:06:37.431 15:22:22 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:06:37.431 15:22:22 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:06:37.431 15:22:22 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:06:37.431 15:22:22 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:06:37.431 15:22:22 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:06:37.431 15:22:22 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:06:37.431 15:22:22 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:06:37.431 15:22:22 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:06:37.431 15:22:22 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:06:37.431 15:22:22 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:06:37.431 15:22:22 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:06:37.431 15:22:22 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:06:37.431 15:22:22 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:06:37.431 15:22:22 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:06:37.431 15:22:22 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:06:37.431 15:22:22 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:06:37.431 15:22:22 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:06:37.431 15:22:22 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:06:37.431 15:22:22 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:06:37.431 15:22:22 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:06:37.431 15:22:22 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:06:37.431 15:22:22 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:06:37.431 15:22:22 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:06:37.431 15:22:22 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:06:37.431 15:22:22 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:06:37.431 15:22:22 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:06:37.431 15:22:22 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:06:37.431 15:22:22 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:06:37.431 15:22:22 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:06:37.431 15:22:22 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:06:37.431 15:22:22 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:06:37.431 15:22:22 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:06:37.431 15:22:22 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:06:37.431 15:22:22 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:06:37.431 15:22:22 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:06:37.431 15:22:22 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:06:37.431 15:22:22 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:06:37.431 15:22:22 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:06:37.431 15:22:22 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:06:37.431 15:22:22 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:06:37.431 15:22:22 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:06:37.431 15:22:22 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 
00:06:37.431 15:22:22 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:06:37.431 15:22:22 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:06:37.431 15:22:22 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:06:37.431 15:22:22 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:06:37.431 15:22:22 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:06:37.431 15:22:22 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:06:37.431 15:22:22 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:06:37.431 15:22:22 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:06:37.431 15:22:22 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:06:37.431 15:22:22 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:06:37.431 15:22:23 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:06:37.431 15:22:23 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:06:37.431 15:22:23 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:06:37.431 15:22:23 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:37.431 15:22:23 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:37.431 15:22:23 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:37.431 15:22:23 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:06:37.431 15:22:23 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:06:37.431 15:22:23 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:06:37.431 15:22:23 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:06:37.431 15:22:23 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:06:37.432 15:22:23 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:06:37.432 15:22:23 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:06:37.432 15:22:23 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:06:37.432 #define SPDK_CONFIG_H 00:06:37.432 #define SPDK_CONFIG_APPS 1 00:06:37.432 #define SPDK_CONFIG_ARCH native 00:06:37.432 #undef SPDK_CONFIG_ASAN 00:06:37.432 #undef SPDK_CONFIG_AVAHI 00:06:37.432 #undef SPDK_CONFIG_CET 00:06:37.432 #define SPDK_CONFIG_COVERAGE 1 00:06:37.432 #define SPDK_CONFIG_CROSS_PREFIX 00:06:37.432 #undef SPDK_CONFIG_CRYPTO 00:06:37.432 #undef SPDK_CONFIG_CRYPTO_MLX5 00:06:37.432 #undef SPDK_CONFIG_CUSTOMOCF 00:06:37.432 #undef SPDK_CONFIG_DAOS 00:06:37.432 #define SPDK_CONFIG_DAOS_DIR 00:06:37.432 #define SPDK_CONFIG_DEBUG 1 00:06:37.432 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:06:37.432 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:06:37.432 #define SPDK_CONFIG_DPDK_INC_DIR 00:06:37.432 #define SPDK_CONFIG_DPDK_LIB_DIR 00:06:37.432 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:06:37.432 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:06:37.432 #define SPDK_CONFIG_EXAMPLES 1 00:06:37.432 #undef SPDK_CONFIG_FC 00:06:37.432 #define SPDK_CONFIG_FC_PATH 00:06:37.432 #define SPDK_CONFIG_FIO_PLUGIN 1 
00:06:37.432 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:06:37.432 #undef SPDK_CONFIG_FUSE 00:06:37.432 #define SPDK_CONFIG_FUZZER 1 00:06:37.432 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:06:37.432 #undef SPDK_CONFIG_GOLANG 00:06:37.432 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:06:37.432 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:06:37.432 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:06:37.432 #undef SPDK_CONFIG_HAVE_LIBBSD 00:06:37.432 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:06:37.432 #define SPDK_CONFIG_IDXD 1 00:06:37.432 #define SPDK_CONFIG_IDXD_KERNEL 1 00:06:37.432 #undef SPDK_CONFIG_IPSEC_MB 00:06:37.432 #define SPDK_CONFIG_IPSEC_MB_DIR 00:06:37.432 #define SPDK_CONFIG_ISAL 1 00:06:37.432 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:06:37.432 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:06:37.432 #define SPDK_CONFIG_LIBDIR 00:06:37.432 #undef SPDK_CONFIG_LTO 00:06:37.432 #define SPDK_CONFIG_MAX_LCORES 00:06:37.432 #define SPDK_CONFIG_NVME_CUSE 1 00:06:37.432 #undef SPDK_CONFIG_OCF 00:06:37.432 #define SPDK_CONFIG_OCF_PATH 00:06:37.432 #define SPDK_CONFIG_OPENSSL_PATH 00:06:37.432 #undef SPDK_CONFIG_PGO_CAPTURE 00:06:37.432 #undef SPDK_CONFIG_PGO_USE 00:06:37.432 #define SPDK_CONFIG_PREFIX /usr/local 00:06:37.432 #undef SPDK_CONFIG_RAID5F 00:06:37.432 #undef SPDK_CONFIG_RBD 00:06:37.432 #define SPDK_CONFIG_RDMA 1 00:06:37.432 #define SPDK_CONFIG_RDMA_PROV verbs 00:06:37.432 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:06:37.432 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:06:37.432 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:06:37.432 #undef SPDK_CONFIG_SHARED 00:06:37.432 #undef SPDK_CONFIG_SMA 00:06:37.432 #define SPDK_CONFIG_TESTS 1 00:06:37.432 #undef SPDK_CONFIG_TSAN 00:06:37.432 #define SPDK_CONFIG_UBLK 1 00:06:37.432 #define SPDK_CONFIG_UBSAN 1 00:06:37.432 #undef SPDK_CONFIG_UNIT_TESTS 00:06:37.432 #undef SPDK_CONFIG_URING 00:06:37.432 #define SPDK_CONFIG_URING_PATH 00:06:37.432 #undef SPDK_CONFIG_URING_ZNS 00:06:37.432 #undef SPDK_CONFIG_USDT 00:06:37.432 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:06:37.432 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:06:37.432 #define SPDK_CONFIG_VFIO_USER 1 00:06:37.432 #define SPDK_CONFIG_VFIO_USER_DIR 00:06:37.432 #define SPDK_CONFIG_VHOST 1 00:06:37.432 #define SPDK_CONFIG_VIRTIO 1 00:06:37.432 #undef SPDK_CONFIG_VTUNE 00:06:37.432 #define SPDK_CONFIG_VTUNE_DIR 00:06:37.432 #define SPDK_CONFIG_WERROR 1 00:06:37.432 #define SPDK_CONFIG_WPDK_DIR 00:06:37.432 #undef SPDK_CONFIG_XNVME 00:06:37.432 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:06:37.432 15:22:23 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:06:37.432 15:22:23 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:37.432 15:22:23 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:37.432 15:22:23 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:37.432 15:22:23 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:37.432 15:22:23 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:37.432 15:22:23 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:37.432 15:22:23 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:37.432 15:22:23 -- paths/export.sh@5 -- # export PATH 00:06:37.432 15:22:23 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:37.432 15:22:23 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:06:37.432 15:22:23 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:06:37.432 15:22:23 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:06:37.432 15:22:23 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:06:37.432 15:22:23 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:06:37.432 15:22:23 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:06:37.432 15:22:23 -- pm/common@16 -- # TEST_TAG=N/A 00:06:37.432 15:22:23 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:06:37.432 15:22:23 -- common/autotest_common.sh@52 -- # : 1 00:06:37.432 15:22:23 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:06:37.432 15:22:23 -- common/autotest_common.sh@56 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:06:37.432 15:22:23 -- common/autotest_common.sh@58 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:06:37.432 15:22:23 -- common/autotest_common.sh@60 -- # : 1 00:06:37.432 15:22:23 -- common/autotest_common.sh@61 -- # export 
SPDK_RUN_FUNCTIONAL_TEST 00:06:37.432 15:22:23 -- common/autotest_common.sh@62 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:06:37.432 15:22:23 -- common/autotest_common.sh@64 -- # : 00:06:37.432 15:22:23 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:06:37.432 15:22:23 -- common/autotest_common.sh@66 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:06:37.432 15:22:23 -- common/autotest_common.sh@68 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:06:37.432 15:22:23 -- common/autotest_common.sh@70 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:06:37.432 15:22:23 -- common/autotest_common.sh@72 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:06:37.432 15:22:23 -- common/autotest_common.sh@74 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:06:37.432 15:22:23 -- common/autotest_common.sh@76 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:06:37.432 15:22:23 -- common/autotest_common.sh@78 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:06:37.432 15:22:23 -- common/autotest_common.sh@80 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:06:37.432 15:22:23 -- common/autotest_common.sh@82 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:06:37.432 15:22:23 -- common/autotest_common.sh@84 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:06:37.432 15:22:23 -- common/autotest_common.sh@86 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:06:37.432 15:22:23 -- common/autotest_common.sh@88 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:06:37.432 15:22:23 -- common/autotest_common.sh@90 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:06:37.432 15:22:23 -- common/autotest_common.sh@92 -- # : 1 00:06:37.432 15:22:23 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:06:37.432 15:22:23 -- common/autotest_common.sh@94 -- # : 1 00:06:37.432 15:22:23 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:06:37.432 15:22:23 -- common/autotest_common.sh@96 -- # : rdma 00:06:37.432 15:22:23 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:06:37.432 15:22:23 -- common/autotest_common.sh@98 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:06:37.432 15:22:23 -- common/autotest_common.sh@100 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:06:37.432 15:22:23 -- common/autotest_common.sh@102 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:06:37.432 15:22:23 -- common/autotest_common.sh@104 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:06:37.432 15:22:23 -- common/autotest_common.sh@106 -- # : 0 00:06:37.432 15:22:23 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:06:37.432 15:22:23 -- common/autotest_common.sh@108 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@109 
-- # export SPDK_TEST_VHOST_INIT 00:06:37.433 15:22:23 -- common/autotest_common.sh@110 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:06:37.433 15:22:23 -- common/autotest_common.sh@112 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:06:37.433 15:22:23 -- common/autotest_common.sh@114 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:06:37.433 15:22:23 -- common/autotest_common.sh@116 -- # : 1 00:06:37.433 15:22:23 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:06:37.433 15:22:23 -- common/autotest_common.sh@118 -- # : 00:06:37.433 15:22:23 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:06:37.433 15:22:23 -- common/autotest_common.sh@120 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:06:37.433 15:22:23 -- common/autotest_common.sh@122 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:06:37.433 15:22:23 -- common/autotest_common.sh@124 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:06:37.433 15:22:23 -- common/autotest_common.sh@126 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:06:37.433 15:22:23 -- common/autotest_common.sh@128 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:06:37.433 15:22:23 -- common/autotest_common.sh@130 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:06:37.433 15:22:23 -- common/autotest_common.sh@132 -- # : 00:06:37.433 15:22:23 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:06:37.433 15:22:23 -- common/autotest_common.sh@134 -- # : true 00:06:37.433 15:22:23 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:06:37.433 15:22:23 -- common/autotest_common.sh@136 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:06:37.433 15:22:23 -- common/autotest_common.sh@138 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:06:37.433 15:22:23 -- common/autotest_common.sh@140 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:06:37.433 15:22:23 -- common/autotest_common.sh@142 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:06:37.433 15:22:23 -- common/autotest_common.sh@144 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:06:37.433 15:22:23 -- common/autotest_common.sh@146 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:06:37.433 15:22:23 -- common/autotest_common.sh@148 -- # : 00:06:37.433 15:22:23 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:06:37.433 15:22:23 -- common/autotest_common.sh@150 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:06:37.433 15:22:23 -- common/autotest_common.sh@152 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:06:37.433 15:22:23 -- common/autotest_common.sh@154 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:06:37.433 15:22:23 -- common/autotest_common.sh@156 -- # : 0 00:06:37.433 15:22:23 -- 
common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:06:37.433 15:22:23 -- common/autotest_common.sh@158 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:06:37.433 15:22:23 -- common/autotest_common.sh@160 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:06:37.433 15:22:23 -- common/autotest_common.sh@163 -- # : 00:06:37.433 15:22:23 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:06:37.433 15:22:23 -- common/autotest_common.sh@165 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:06:37.433 15:22:23 -- common/autotest_common.sh@167 -- # : 0 00:06:37.433 15:22:23 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:06:37.433 15:22:23 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:06:37.433 15:22:23 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:06:37.433 15:22:23 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:06:37.433 15:22:23 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:06:37.433 15:22:23 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:37.433 15:22:23 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:37.433 15:22:23 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:37.433 15:22:23 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:37.433 15:22:23 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:06:37.433 15:22:23 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:06:37.433 15:22:23 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:37.433 15:22:23 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:37.433 15:22:23 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:06:37.433 15:22:23 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:06:37.433 15:22:23 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:06:37.433 15:22:23 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:06:37.433 15:22:23 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:06:37.433 15:22:23 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:06:37.433 15:22:23 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:06:37.433 15:22:23 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:06:37.433 15:22:23 -- common/autotest_common.sh@196 -- # cat 00:06:37.433 15:22:23 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:06:37.433 15:22:23 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:06:37.433 15:22:23 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:06:37.433 15:22:23 -- 
common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:06:37.433 15:22:23 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:06:37.433 15:22:23 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:06:37.433 15:22:23 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:06:37.433 15:22:23 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:37.433 15:22:23 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:37.433 15:22:23 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:37.433 15:22:23 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:37.433 15:22:23 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:06:37.433 15:22:23 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:06:37.433 15:22:23 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:06:37.433 15:22:23 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:06:37.433 15:22:23 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:06:37.433 15:22:23 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:06:37.433 15:22:23 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:06:37.433 15:22:23 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:06:37.433 15:22:23 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:06:37.433 15:22:23 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:06:37.433 15:22:23 -- common/autotest_common.sh@249 -- # _LCOV= 00:06:37.433 15:22:23 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:06:37.433 15:22:23 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:06:37.433 15:22:23 -- common/autotest_common.sh@250 -- # _LCOV=1 00:06:37.433 15:22:23 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:06:37.433 15:22:23 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:06:37.433 15:22:23 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:06:37.433 15:22:23 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:06:37.434 15:22:23 -- common/autotest_common.sh@259 -- # export valgrind= 00:06:37.434 15:22:23 -- common/autotest_common.sh@259 -- # valgrind= 00:06:37.434 15:22:23 -- common/autotest_common.sh@265 -- # uname -s 00:06:37.434 15:22:23 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:06:37.434 15:22:23 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:06:37.434 15:22:23 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:06:37.434 15:22:23 -- common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:06:37.434 15:22:23 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:06:37.434 15:22:23 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:06:37.434 15:22:23 
-- common/autotest_common.sh@275 -- # MAKE=make 00:06:37.434 15:22:23 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:06:37.434 15:22:23 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:06:37.434 15:22:23 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:06:37.434 15:22:23 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:06:37.434 15:22:23 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:06:37.434 15:22:23 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:06:37.434 15:22:23 -- common/autotest_common.sh@319 -- # [[ -z 1595038 ]] 00:06:37.434 15:22:23 -- common/autotest_common.sh@319 -- # kill -0 1595038 00:06:37.434 15:22:23 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:06:37.434 15:22:23 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:06:37.434 15:22:23 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:06:37.434 15:22:23 -- common/autotest_common.sh@332 -- # local mount target_dir 00:06:37.434 15:22:23 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:06:37.434 15:22:23 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:06:37.434 15:22:23 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:06:37.434 15:22:23 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:06:37.434 15:22:23 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.o7axwL 00:06:37.434 15:22:23 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:06:37.434 15:22:23 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:06:37.434 15:22:23 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:06:37.434 15:22:23 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.o7axwL/tests/nvmf /tmp/spdk.o7axwL 00:06:37.434 15:22:23 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:06:37.434 15:22:23 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:37.434 15:22:23 -- common/autotest_common.sh@328 -- # df -T 00:06:37.434 15:22:23 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:06:37.434 15:22:23 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:06:37.434 15:22:23 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:06:37.434 15:22:23 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:06:37.434 15:22:23 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:06:37.434 15:22:23 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:06:37.434 15:22:23 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:37.434 15:22:23 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:06:37.434 15:22:23 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:06:37.434 15:22:23 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:06:37.434 15:22:23 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:06:37.434 15:22:23 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:06:37.434 15:22:23 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:37.434 15:22:23 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:06:37.434 15:22:23 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 
00:06:37.434 15:22:23 -- common/autotest_common.sh@363 -- # avails["$mount"]=54447513600 00:06:37.434 15:22:23 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730574336 00:06:37.434 15:22:23 -- common/autotest_common.sh@364 -- # uses["$mount"]=7283060736 00:06:37.434 15:22:23 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:37.434 15:22:23 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:06:37.434 15:22:23 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:06:37.434 15:22:23 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864027648 00:06:37.434 15:22:23 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865285120 00:06:37.434 15:22:23 -- common/autotest_common.sh@364 -- # uses["$mount"]=1257472 00:06:37.434 15:22:23 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:37.434 15:22:23 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:06:37.434 15:22:23 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:06:37.434 15:22:23 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340117504 00:06:37.434 15:22:23 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346118144 00:06:37.434 15:22:23 -- common/autotest_common.sh@364 -- # uses["$mount"]=6000640 00:06:37.434 15:22:23 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:37.434 15:22:23 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:06:37.434 15:22:23 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:06:37.434 15:22:23 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864945152 00:06:37.434 15:22:23 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865289216 00:06:37.434 15:22:23 -- common/autotest_common.sh@364 -- # uses["$mount"]=344064 00:06:37.434 15:22:23 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:37.434 15:22:23 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:06:37.434 15:22:23 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:06:37.434 15:22:23 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:06:37.434 15:22:23 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:06:37.434 15:22:23 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:06:37.434 15:22:23 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:37.434 15:22:23 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:06:37.434 * Looking for test storage... 
00:06:37.434 15:22:23 -- common/autotest_common.sh@369 -- # local target_space new_size 00:06:37.434 15:22:23 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:06:37.434 15:22:23 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:37.434 15:22:23 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:06:37.434 15:22:23 -- common/autotest_common.sh@373 -- # mount=/ 00:06:37.434 15:22:23 -- common/autotest_common.sh@375 -- # target_space=54447513600 00:06:37.434 15:22:23 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:06:37.434 15:22:23 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:06:37.434 15:22:23 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:06:37.434 15:22:23 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:06:37.434 15:22:23 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:06:37.434 15:22:23 -- common/autotest_common.sh@382 -- # new_size=9497653248 00:06:37.434 15:22:23 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:06:37.434 15:22:23 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:37.434 15:22:23 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:37.434 15:22:23 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:37.434 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:37.434 15:22:23 -- common/autotest_common.sh@390 -- # return 0 00:06:37.434 15:22:23 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:06:37.434 15:22:23 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:06:37.434 15:22:23 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:06:37.434 15:22:23 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:06:37.434 15:22:23 -- common/autotest_common.sh@1682 -- # true 00:06:37.434 15:22:23 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:06:37.434 15:22:23 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:06:37.434 15:22:23 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:06:37.434 15:22:23 -- common/autotest_common.sh@27 -- # exec 00:06:37.434 15:22:23 -- common/autotest_common.sh@29 -- # exec 00:06:37.434 15:22:23 -- common/autotest_common.sh@31 -- # xtrace_restore 00:06:37.434 15:22:23 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:06:37.434 15:22:23 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:06:37.434 15:22:23 -- common/autotest_common.sh@18 -- # set -x 00:06:37.434 15:22:23 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:37.434 15:22:23 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:37.434 15:22:23 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:37.434 15:22:23 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:37.434 15:22:23 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:37.434 15:22:23 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:37.434 15:22:23 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:37.434 15:22:23 -- scripts/common.sh@335 -- # IFS=.-: 00:06:37.434 15:22:23 -- scripts/common.sh@335 -- # read -ra ver1 00:06:37.434 15:22:23 -- scripts/common.sh@336 -- # IFS=.-: 00:06:37.434 15:22:23 -- scripts/common.sh@336 -- # read -ra ver2 00:06:37.434 15:22:23 -- scripts/common.sh@337 -- # local 'op=<' 00:06:37.434 15:22:23 -- scripts/common.sh@339 -- # ver1_l=2 00:06:37.434 15:22:23 -- scripts/common.sh@340 -- # ver2_l=1 00:06:37.434 15:22:23 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:37.434 15:22:23 -- scripts/common.sh@343 -- # case "$op" in 00:06:37.434 15:22:23 -- scripts/common.sh@344 -- # : 1 00:06:37.434 15:22:23 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:37.434 15:22:23 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:37.434 15:22:23 -- scripts/common.sh@364 -- # decimal 1 00:06:37.434 15:22:23 -- scripts/common.sh@352 -- # local d=1 00:06:37.434 15:22:23 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:37.434 15:22:23 -- scripts/common.sh@354 -- # echo 1 00:06:37.434 15:22:23 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:37.434 15:22:23 -- scripts/common.sh@365 -- # decimal 2 00:06:37.434 15:22:23 -- scripts/common.sh@352 -- # local d=2 00:06:37.434 15:22:23 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:37.434 15:22:23 -- scripts/common.sh@354 -- # echo 2 00:06:37.434 15:22:23 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:37.434 15:22:23 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:37.435 15:22:23 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:37.435 15:22:23 -- scripts/common.sh@367 -- # return 0 00:06:37.435 15:22:23 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:37.435 15:22:23 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:37.435 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:37.435 --rc genhtml_branch_coverage=1 00:06:37.435 --rc genhtml_function_coverage=1 00:06:37.435 --rc genhtml_legend=1 00:06:37.435 --rc geninfo_all_blocks=1 00:06:37.435 --rc geninfo_unexecuted_blocks=1 00:06:37.435 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:37.435 ' 00:06:37.435 15:22:23 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:37.435 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:37.435 --rc genhtml_branch_coverage=1 00:06:37.435 --rc genhtml_function_coverage=1 00:06:37.435 --rc genhtml_legend=1 00:06:37.435 --rc geninfo_all_blocks=1 00:06:37.435 --rc geninfo_unexecuted_blocks=1 00:06:37.435 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:37.435 ' 00:06:37.435 15:22:23 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:37.435 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:06:37.435 --rc genhtml_branch_coverage=1 00:06:37.435 --rc genhtml_function_coverage=1 00:06:37.435 --rc genhtml_legend=1 00:06:37.435 --rc geninfo_all_blocks=1 00:06:37.435 --rc geninfo_unexecuted_blocks=1 00:06:37.435 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:37.435 ' 00:06:37.435 15:22:23 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:37.435 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:37.435 --rc genhtml_branch_coverage=1 00:06:37.435 --rc genhtml_function_coverage=1 00:06:37.435 --rc genhtml_legend=1 00:06:37.435 --rc geninfo_all_blocks=1 00:06:37.435 --rc geninfo_unexecuted_blocks=1 00:06:37.435 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:37.435 ' 00:06:37.435 15:22:23 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:06:37.435 15:22:23 -- ../common.sh@8 -- # pids=() 00:06:37.435 15:22:23 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:06:37.435 15:22:23 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:06:37.435 15:22:23 -- nvmf/run.sh@56 -- # fuzz_num=25 00:06:37.435 15:22:23 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:06:37.435 15:22:23 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:06:37.435 15:22:23 -- nvmf/run.sh@61 -- # mem_size=512 00:06:37.435 15:22:23 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:06:37.435 15:22:23 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:06:37.435 15:22:23 -- ../common.sh@69 -- # local fuzz_num=25 00:06:37.435 15:22:23 -- ../common.sh@70 -- # local time=1 00:06:37.435 15:22:23 -- ../common.sh@72 -- # (( i = 0 )) 00:06:37.435 15:22:23 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:37.435 15:22:23 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:06:37.435 15:22:23 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:06:37.435 15:22:23 -- nvmf/run.sh@24 -- # local timen=1 00:06:37.435 15:22:23 -- nvmf/run.sh@25 -- # local core=0x1 00:06:37.435 15:22:23 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:37.435 15:22:23 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:06:37.435 15:22:23 -- nvmf/run.sh@29 -- # printf %02d 0 00:06:37.435 15:22:23 -- nvmf/run.sh@29 -- # port=4400 00:06:37.435 15:22:23 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:37.435 15:22:23 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:06:37.435 15:22:23 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:37.435 15:22:23 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:06:37.694 [2024-12-02 15:22:23.258024] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 
initialization... 00:06:37.694 [2024-12-02 15:22:23.258079] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1595097 ] 00:06:37.694 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.952 [2024-12-02 15:22:23.516377] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.952 [2024-12-02 15:22:23.602806] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:37.952 [2024-12-02 15:22:23.602935] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.952 [2024-12-02 15:22:23.661429] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:37.952 [2024-12-02 15:22:23.677776] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:06:37.952 INFO: Running with entropic power schedule (0xFF, 100). 00:06:37.952 INFO: Seed: 1031731007 00:06:37.952 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:37.953 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:37.953 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:37.953 INFO: A corpus is not provided, starting from an empty corpus 00:06:37.953 #2 INITED exec/s: 0 rss: 60Mb 00:06:37.953 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:37.953 This may also happen if the target rejected all inputs we tried so far 00:06:37.953 [2024-12-02 15:22:23.747685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:82f80000 cdw10:c1c1c1c1 cdw11:c1c1c1c1 00:06:37.953 [2024-12-02 15:22:23.747722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.469 NEW_FUNC[1/669]: 0x43a858 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:06:38.469 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:38.469 #6 NEW cov: 11524 ft: 11524 corp: 2/72b lim: 320 exec/s: 0 rss: 68Mb L: 71/71 MS: 4 CMP-CMP-ChangeBit-InsertRepeatedBytes- DE: "\000\226K\244L\017\370\202"-"+\000\000\000"- 00:06:38.469 [2024-12-02 15:22:24.087841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:38.469 [2024-12-02 15:22:24.087880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.469 NEW_FUNC[1/2]: 0x12c5228 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2016 00:06:38.469 NEW_FUNC[2/2]: 0x16c34e8 in nvme_get_sgl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:159 00:06:38.469 #16 NEW cov: 11691 ft: 12286 corp: 3/158b lim: 320 exec/s: 0 rss: 69Mb L: 86/86 MS: 5 CopyPart-EraseBytes-ChangeByte-CopyPart-InsertRepeatedBytes- 00:06:38.469 [2024-12-02 15:22:24.127910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffff3d cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 
0xffffffffffffffff 00:06:38.469 [2024-12-02 15:22:24.127938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.469 #17 NEW cov: 11697 ft: 12723 corp: 4/245b lim: 320 exec/s: 0 rss: 69Mb L: 87/87 MS: 1 InsertByte- 00:06:38.469 [2024-12-02 15:22:24.167956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:38.469 [2024-12-02 15:22:24.167981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.469 #18 NEW cov: 11782 ft: 12965 corp: 5/331b lim: 320 exec/s: 0 rss: 69Mb L: 86/87 MS: 1 ChangeByte- 00:06:38.469 [2024-12-02 15:22:24.208126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:38.469 [2024-12-02 15:22:24.208153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.469 #19 NEW cov: 11782 ft: 13001 corp: 6/418b lim: 320 exec/s: 0 rss: 69Mb L: 87/87 MS: 1 InsertByte- 00:06:38.469 [2024-12-02 15:22:24.248260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:38.469 [2024-12-02 15:22:24.248286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.469 #20 NEW cov: 11782 ft: 13168 corp: 7/490b lim: 320 exec/s: 0 rss: 69Mb L: 72/87 MS: 1 EraseBytes- 00:06:38.469 [2024-12-02 15:22:24.288556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffff2bffffffff 00:06:38.469 [2024-12-02 15:22:24.288581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.469 [2024-12-02 15:22:24.288704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:38.469 [2024-12-02 15:22:24.288719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.728 #21 NEW cov: 11782 ft: 13402 corp: 8/662b lim: 320 exec/s: 0 rss: 69Mb L: 172/172 MS: 1 CopyPart- 00:06:38.728 [2024-12-02 15:22:24.328617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ad) qid:0 cid:4 nsid:ffffffff cdw10:ffff3dff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:38.728 [2024-12-02 15:22:24.328645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.728 NEW_FUNC[1/1]: 0x16c4058 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:06:38.728 #24 NEW cov: 11796 ft: 13763 corp: 9/750b lim: 320 exec/s: 0 rss: 69Mb L: 88/172 MS: 3 ChangeByte-ChangeBit-CrossOver- 00:06:38.728 [2024-12-02 15:22:24.368853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT 
DATA BLOCK TRANSPORT 0xffffff2bffffffff 00:06:38.728 [2024-12-02 15:22:24.368880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.728 [2024-12-02 15:22:24.369015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:38.728 [2024-12-02 15:22:24.369034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.728 #25 NEW cov: 11796 ft: 13829 corp: 10/928b lim: 320 exec/s: 0 rss: 69Mb L: 178/178 MS: 1 InsertRepeatedBytes- 00:06:38.728 [2024-12-02 15:22:24.408748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:82f80000 cdw10:c1c1c1c1 cdw11:c1c1c1c1 00:06:38.728 [2024-12-02 15:22:24.408774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.728 #26 NEW cov: 11796 ft: 13854 corp: 11/999b lim: 320 exec/s: 0 rss: 69Mb L: 71/178 MS: 1 ChangeByte- 00:06:38.728 [2024-12-02 15:22:24.448866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffff3d cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:38.728 [2024-12-02 15:22:24.448890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.728 #27 NEW cov: 11796 ft: 13883 corp: 12/1086b lim: 320 exec/s: 0 rss: 69Mb L: 87/178 MS: 1 PersAutoDict- DE: "\000\226K\244L\017\370\202"- 00:06:38.728 [2024-12-02 15:22:24.489171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffff2bffffffff 00:06:38.728 [2024-12-02 15:22:24.489199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.728 [2024-12-02 15:22:24.489335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:38.728 [2024-12-02 15:22:24.489353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.728 #28 NEW cov: 11796 ft: 13918 corp: 13/1259b lim: 320 exec/s: 0 rss: 69Mb L: 173/178 MS: 1 InsertByte- 00:06:38.728 [2024-12-02 15:22:24.529244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffff2bffffffff 00:06:38.728 [2024-12-02 15:22:24.529271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.728 [2024-12-02 15:22:24.529409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:38.728 [2024-12-02 15:22:24.529425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.988 #29 NEW cov: 11796 ft: 14004 corp: 14/1441b lim: 320 exec/s: 0 rss: 69Mb L: 182/182 MS: 1 CrossOver- 00:06:38.988 [2024-12-02 
15:22:24.569195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:38.988 [2024-12-02 15:22:24.569221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.988 #30 NEW cov: 11796 ft: 14030 corp: 15/1513b lim: 320 exec/s: 0 rss: 69Mb L: 72/182 MS: 1 ShuffleBytes- 00:06:38.988 [2024-12-02 15:22:24.609536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:2a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x2a2a2a2a2a2a2a2a 00:06:38.988 [2024-12-02 15:22:24.609562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.988 [2024-12-02 15:22:24.609686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2a) qid:0 cid:5 nsid:2a2a2a2a cdw10:2a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x2a2a2a2a2a2a2a2a 00:06:38.988 [2024-12-02 15:22:24.609703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.988 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:38.988 #31 NEW cov: 11819 ft: 14088 corp: 16/1680b lim: 320 exec/s: 0 rss: 69Mb L: 167/182 MS: 1 InsertRepeatedBytes- 00:06:38.988 [2024-12-02 15:22:24.659480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ff19ffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:38.988 [2024-12-02 15:22:24.659506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.988 #32 NEW cov: 11819 ft: 14135 corp: 17/1767b lim: 320 exec/s: 0 rss: 70Mb L: 87/182 MS: 1 ChangeByte- 00:06:38.988 [2024-12-02 15:22:24.699612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:2bffffff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:38.988 [2024-12-02 15:22:24.699638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.988 #33 NEW cov: 11819 ft: 14140 corp: 18/1853b lim: 320 exec/s: 0 rss: 70Mb L: 86/182 MS: 1 PersAutoDict- DE: "+\000\000\000"- 00:06:38.988 [2024-12-02 15:22:24.729678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ff19ffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:38.988 [2024-12-02 15:22:24.729703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.988 #34 NEW cov: 11819 ft: 14148 corp: 19/1944b lim: 320 exec/s: 34 rss: 70Mb L: 91/182 MS: 1 PersAutoDict- DE: "+\000\000\000"- 00:06:38.988 [2024-12-02 15:22:24.769828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:82f80000 cdw10:ffffffc4 cdw11:ffffffff 00:06:38.988 [2024-12-02 15:22:24.769853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.988 #35 NEW cov: 11819 ft: 14180 corp: 20/2015b lim: 320 exec/s: 35 rss: 70Mb L: 71/182 MS: 1 
CrossOver- 00:06:39.246 [2024-12-02 15:22:24.810033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:82f80000 cdw10:ffffffff cdw11:ffffffff 00:06:39.246 [2024-12-02 15:22:24.810059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.247 #36 NEW cov: 11819 ft: 14190 corp: 21/2086b lim: 320 exec/s: 36 rss: 70Mb L: 71/182 MS: 1 CrossOver- 00:06:39.247 [2024-12-02 15:22:24.850284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ff19ffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:39.247 [2024-12-02 15:22:24.850310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.247 [2024-12-02 15:22:24.850441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:39.247 [2024-12-02 15:22:24.850457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.247 #37 NEW cov: 11819 ft: 14231 corp: 22/2217b lim: 320 exec/s: 37 rss: 70Mb L: 131/182 MS: 1 InsertRepeatedBytes- 00:06:39.247 [2024-12-02 15:22:24.890567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffff2bffffffff 00:06:39.247 [2024-12-02 15:22:24.890592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.247 [2024-12-02 15:22:24.890718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffff3dff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:39.247 [2024-12-02 15:22:24.890737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.247 [2024-12-02 15:22:24.890866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:39.247 [2024-12-02 15:22:24.890881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:39.247 #38 NEW cov: 11819 ft: 14445 corp: 23/2414b lim: 320 exec/s: 38 rss: 70Mb L: 197/197 MS: 1 CrossOver- 00:06:39.247 [2024-12-02 15:22:24.930508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffff2bffffffff 00:06:39.247 [2024-12-02 15:22:24.930533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.247 [2024-12-02 15:22:24.930666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:39.247 [2024-12-02 15:22:24.930682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.247 #39 NEW cov: 11819 ft: 14458 corp: 24/2596b lim: 320 exec/s: 39 rss: 
70Mb L: 182/197 MS: 1 ShuffleBytes- 00:06:39.247 [2024-12-02 15:22:24.970499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:82f80000 cdw10:ffffffff cdw11:ffffffc4 00:06:39.247 [2024-12-02 15:22:24.970525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.247 #40 NEW cov: 11819 ft: 14472 corp: 25/2671b lim: 320 exec/s: 40 rss: 70Mb L: 75/197 MS: 1 PersAutoDict- DE: "+\000\000\000"- 00:06:39.247 [2024-12-02 15:22:25.010944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffff2bffffffff 00:06:39.247 [2024-12-02 15:22:25.010971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.247 [2024-12-02 15:22:25.011094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffff3dff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:39.247 [2024-12-02 15:22:25.011110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.247 [2024-12-02 15:22:25.011233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:39.247 [2024-12-02 15:22:25.011248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:39.247 #41 NEW cov: 11819 ft: 14479 corp: 26/2868b lim: 320 exec/s: 41 rss: 70Mb L: 197/197 MS: 1 ShuffleBytes- 00:06:39.247 [2024-12-02 15:22:25.050729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:39.247 [2024-12-02 15:22:25.050761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.504 #42 NEW cov: 11819 ft: 14491 corp: 27/2954b lim: 320 exec/s: 42 rss: 70Mb L: 86/197 MS: 1 CopyPart- 00:06:39.504 [2024-12-02 15:22:25.091009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:39.504 [2024-12-02 15:22:25.091035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.504 [2024-12-02 15:22:25.091160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:91919191 cdw11:91919191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9191919191919191 00:06:39.504 [2024-12-02 15:22:25.091176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.504 #43 NEW cov: 11819 ft: 14534 corp: 28/3132b lim: 320 exec/s: 43 rss: 70Mb L: 178/197 MS: 1 InsertRepeatedBytes- 00:06:39.504 [2024-12-02 15:22:25.130993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:39.504 [2024-12-02 15:22:25.131021] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.504 #44 NEW cov: 11819 ft: 14537 corp: 29/3218b lim: 320 exec/s: 44 rss: 70Mb L: 86/197 MS: 1 CopyPart- 00:06:39.504 [2024-12-02 15:22:25.161154] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:92929292 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9292929292929292 00:06:39.504 [2024-12-02 15:22:25.161180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.504 #45 NEW cov: 11836 ft: 14580 corp: 30/3304b lim: 320 exec/s: 45 rss: 70Mb L: 86/197 MS: 1 InsertRepeatedBytes- 00:06:39.504 [2024-12-02 15:22:25.201387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:2a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x2a2a2a2a2a2a2a2a 00:06:39.504 [2024-12-02 15:22:25.201414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.504 [2024-12-02 15:22:25.201548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2a) qid:0 cid:5 nsid:2a2a2a2a cdw10:2a2a2a2a cdw11:2a2a2a2a SGL TRANSPORT DATA BLOCK TRANSPORT 0x2a2a2a2a2a2a2a2a 00:06:39.504 [2024-12-02 15:22:25.201564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.504 #46 NEW cov: 11836 ft: 14596 corp: 31/3471b lim: 320 exec/s: 46 rss: 70Mb L: 167/197 MS: 1 CopyPart- 00:06:39.504 [2024-12-02 15:22:25.251540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ad) qid:0 cid:4 nsid:ffffffff cdw10:ffff3dff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:39.504 [2024-12-02 15:22:25.251566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.504 #47 NEW cov: 11836 ft: 14612 corp: 32/3560b lim: 320 exec/s: 47 rss: 70Mb L: 89/197 MS: 1 InsertByte- 00:06:39.504 [2024-12-02 15:22:25.301742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:82f80000 cdw10:c1c1c1c1 cdw11:c1c1c1c1 00:06:39.504 [2024-12-02 15:22:25.301768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.504 [2024-12-02 15:22:25.301875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:39.504 [2024-12-02 15:22:25.301891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.504 #48 NEW cov: 11836 ft: 14680 corp: 33/3694b lim: 320 exec/s: 48 rss: 70Mb L: 134/197 MS: 1 InsertRepeatedBytes- 00:06:39.761 [2024-12-02 15:22:25.341837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:39.761 [2024-12-02 15:22:25.341864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.761 [2024-12-02 15:22:25.341989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:91919191 cdw11:91919191 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9191919191919191 00:06:39.761 [2024-12-02 
15:22:25.342005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.761 #49 NEW cov: 11836 ft: 14713 corp: 34/3872b lim: 320 exec/s: 49 rss: 70Mb L: 178/197 MS: 1 ShuffleBytes- 00:06:39.761 [2024-12-02 15:22:25.391844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffff3d cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:39.761 [2024-12-02 15:22:25.391875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.761 #50 NEW cov: 11836 ft: 14724 corp: 35/3959b lim: 320 exec/s: 50 rss: 70Mb L: 87/197 MS: 1 ChangeBit- 00:06:39.761 [2024-12-02 15:22:25.431926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:39.761 [2024-12-02 15:22:25.431953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.761 #51 NEW cov: 11836 ft: 14751 corp: 36/4045b lim: 320 exec/s: 51 rss: 70Mb L: 86/197 MS: 1 ChangeByte- 00:06:39.761 [2024-12-02 15:22:25.472012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:39.761 [2024-12-02 15:22:25.472039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.761 #52 NEW cov: 11836 ft: 14753 corp: 37/4154b lim: 320 exec/s: 52 rss: 70Mb L: 109/197 MS: 1 EraseBytes- 00:06:39.761 [2024-12-02 15:22:25.512157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ff19ffff cdw10:ffffffff cdw11:ff0000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:39.761 [2024-12-02 15:22:25.512186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.761 #53 NEW cov: 11836 ft: 14758 corp: 38/4243b lim: 320 exec/s: 53 rss: 70Mb L: 89/197 MS: 1 CrossOver- 00:06:39.761 [2024-12-02 15:22:25.552359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:82f80000 cdw10:c1c1c1c1 cdw11:c1c1c1c1 00:06:39.761 [2024-12-02 15:22:25.552387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.761 #54 NEW cov: 11836 ft: 14763 corp: 39/4315b lim: 320 exec/s: 54 rss: 70Mb L: 72/197 MS: 1 InsertByte- 00:06:40.019 [2024-12-02 15:22:25.582498] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:92929292 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9292929292928292 00:06:40.019 [2024-12-02 15:22:25.582525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.019 #55 NEW cov: 11836 ft: 14766 corp: 40/4401b lim: 320 exec/s: 55 rss: 70Mb L: 86/197 MS: 1 ChangeBit- 00:06:40.019 [2024-12-02 15:22:25.622607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:82f80000 cdw10:c1c1c1c1 cdw11:c1c1c1c1 00:06:40.019 [2024-12-02 15:22:25.622632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:06:40.019 #56 NEW cov: 11836 ft: 14777 corp: 41/4493b lim: 320 exec/s: 56 rss: 70Mb L: 92/197 MS: 1 CopyPart- 00:06:40.019 [2024-12-02 15:22:25.652821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffff2bffffffff 00:06:40.019 [2024-12-02 15:22:25.652847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.019 [2024-12-02 15:22:25.652985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:40.019 [2024-12-02 15:22:25.653000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.019 #57 NEW cov: 11836 ft: 14786 corp: 42/4675b lim: 320 exec/s: 57 rss: 70Mb L: 182/197 MS: 1 CMP- DE: "\013\000"- 00:06:40.019 [2024-12-02 15:22:25.692795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:40.019 [2024-12-02 15:22:25.692823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.019 [2024-12-02 15:22:25.692979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:40.019 [2024-12-02 15:22:25.692996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.019 #59 NEW cov: 11836 ft: 14796 corp: 43/4828b lim: 320 exec/s: 59 rss: 70Mb L: 153/197 MS: 2 PersAutoDict-CrossOver- DE: "\013\000"- 00:06:40.019 [2024-12-02 15:22:25.732766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2b) qid:0 cid:4 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:40.019 [2024-12-02 15:22:25.732791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.019 #60 NEW cov: 11836 ft: 14800 corp: 44/4915b lim: 320 exec/s: 30 rss: 70Mb L: 87/197 MS: 1 ChangeBinInt- 00:06:40.019 #60 DONE cov: 11836 ft: 14800 corp: 44/4915b lim: 320 exec/s: 30 rss: 70Mb 00:06:40.019 ###### Recommended dictionary. ###### 00:06:40.019 "\000\226K\244L\017\370\202" # Uses: 1 00:06:40.019 "+\000\000\000" # Uses: 3 00:06:40.019 "\013\000" # Uses: 1 00:06:40.019 ###### End of recommended dictionary. 
###### 00:06:40.019 Done 60 runs in 2 second(s) 00:06:40.277 15:22:25 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:06:40.277 15:22:25 -- ../common.sh@72 -- # (( i++ )) 00:06:40.277 15:22:25 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:40.277 15:22:25 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:06:40.277 15:22:25 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:06:40.277 15:22:25 -- nvmf/run.sh@24 -- # local timen=1 00:06:40.278 15:22:25 -- nvmf/run.sh@25 -- # local core=0x1 00:06:40.278 15:22:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:06:40.278 15:22:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:06:40.278 15:22:25 -- nvmf/run.sh@29 -- # printf %02d 1 00:06:40.278 15:22:25 -- nvmf/run.sh@29 -- # port=4401 00:06:40.278 15:22:25 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:06:40.278 15:22:25 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:06:40.278 15:22:25 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:40.278 15:22:25 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:06:40.278 [2024-12-02 15:22:25.918270] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:40.278 [2024-12-02 15:22:25.918341] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1595642 ] 00:06:40.278 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.536 [2024-12-02 15:22:26.168794] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.536 [2024-12-02 15:22:26.258194] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:40.536 [2024-12-02 15:22:26.258314] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.536 [2024-12-02 15:22:26.316358] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:40.536 [2024-12-02 15:22:26.332699] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:06:40.536 INFO: Running with entropic power schedule (0xFF, 100). 00:06:40.536 INFO: Seed: 3687729038 00:06:40.794 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:40.794 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:40.794 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:06:40.794 INFO: A corpus is not provided, starting from an empty corpus 00:06:40.794 #2 INITED exec/s: 0 rss: 60Mb 00:06:40.795 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:06:40.795 This may also happen if the target rejected all inputs we tried so far 00:06:40.795 [2024-12-02 15:22:26.378034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.795 [2024-12-02 15:22:26.378067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.795 [2024-12-02 15:22:26.378121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.795 [2024-12-02 15:22:26.378136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:40.795 [2024-12-02 15:22:26.378189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.795 [2024-12-02 15:22:26.378203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.054 NEW_FUNC[1/671]: 0x43b158 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:06:41.054 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:41.054 #3 NEW cov: 11647 ft: 11648 corp: 2/20b lim: 30 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:06:41.054 [2024-12-02 15:22:26.698573] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (36908) > buf size (4096) 00:06:41.054 [2024-12-02 15:22:26.699111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:240a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.054 [2024-12-02 15:22:26.699143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.054 [2024-12-02 15:22:26.699201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.054 [2024-12-02 15:22:26.699216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.054 [2024-12-02 15:22:26.699273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.054 [2024-12-02 15:22:26.699287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.054 [2024-12-02 15:22:26.699343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.054 [2024-12-02 15:22:26.699357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.054 #5 NEW cov: 11769 ft: 12613 corp: 3/48b lim: 30 exec/s: 0 rss: 68Mb L: 28/28 MS: 2 InsertByte-InsertRepeatedBytes- 00:06:41.054 [2024-12-02 15:22:26.738591] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (73876) > buf size (4096) 00:06:41.054 [2024-12-02 
15:22:26.739109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:4824000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.054 [2024-12-02 15:22:26.739137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.054 [2024-12-02 15:22:26.739193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.054 [2024-12-02 15:22:26.739212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.054 [2024-12-02 15:22:26.739267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.054 [2024-12-02 15:22:26.739281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.054 [2024-12-02 15:22:26.739336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.054 [2024-12-02 15:22:26.739350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.054 #6 NEW cov: 11775 ft: 12841 corp: 4/77b lim: 30 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 InsertByte- 00:06:41.054 [2024-12-02 15:22:26.778671] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.054 [2024-12-02 15:22:26.778797] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.054 [2024-12-02 15:22:26.778909] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.054 [2024-12-02 15:22:26.779018] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.054 [2024-12-02 15:22:26.779232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a5d83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.054 [2024-12-02 15:22:26.779259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.054 [2024-12-02 15:22:26.779317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.054 [2024-12-02 15:22:26.779332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.054 [2024-12-02 15:22:26.779387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.054 [2024-12-02 15:22:26.779401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.054 [2024-12-02 15:22:26.779457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.054 [2024-12-02 15:22:26.779471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:06:41.054 #9 NEW cov: 11866 ft: 13111 corp: 5/106b lim: 30 exec/s: 0 rss: 69Mb L: 29/29 MS: 3 InsertByte-ChangeBinInt-InsertRepeatedBytes- 00:06:41.054 [2024-12-02 15:22:26.818817] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.055 [2024-12-02 15:22:26.818934] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.055 [2024-12-02 15:22:26.819046] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.055 [2024-12-02 15:22:26.819256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.055 [2024-12-02 15:22:26.819284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.055 [2024-12-02 15:22:26.819343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.055 [2024-12-02 15:22:26.819359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.055 [2024-12-02 15:22:26.819415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.055 [2024-12-02 15:22:26.819431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.055 #10 NEW cov: 11866 ft: 13184 corp: 6/128b lim: 30 exec/s: 0 rss: 69Mb L: 22/29 MS: 1 InsertRepeatedBytes- 00:06:41.055 [2024-12-02 15:22:26.848868] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000024ff 00:06:41.055 [2024-12-02 15:22:26.848985] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.055 [2024-12-02 15:22:26.849094] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.055 [2024-12-02 15:22:26.849316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.055 [2024-12-02 15:22:26.849343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.055 [2024-12-02 15:22:26.849399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.055 [2024-12-02 15:22:26.849414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.055 [2024-12-02 15:22:26.849468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.055 [2024-12-02 15:22:26.849482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.314 #11 NEW cov: 11866 ft: 13390 corp: 7/151b lim: 30 exec/s: 0 rss: 69Mb L: 23/29 MS: 1 InsertByte- 00:06:41.314 [2024-12-02 15:22:26.888999] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000024ff 00:06:41.314 [2024-12-02 15:22:26.889117] ctrlr.c:2504:nvmf_ctrlr_get_log_page: 
*ERROR*: Invalid log page offset 0x30000ffff 00:06:41.314 [2024-12-02 15:22:26.889229] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.314 [2024-12-02 15:22:26.889336] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0a 00:06:41.314 [2024-12-02 15:22:26.889565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:26.889592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:26.889649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:caff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:26.889664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:26.889721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:26.889739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:26.889795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:26.889811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.314 #12 NEW cov: 11866 ft: 13513 corp: 8/175b lim: 30 exec/s: 0 rss: 69Mb L: 24/29 MS: 1 InsertByte- 00:06:41.314 [2024-12-02 15:22:26.939135] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (36908) > buf size (4096) 00:06:41.314 [2024-12-02 15:22:26.939649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:240a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:26.939677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:26.939739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:26.939754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:26.939808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:26.939822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:26.939877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:26.939890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.314 #13 NEW cov: 11866 ft: 13544 corp: 
9/203b lim: 30 exec/s: 0 rss: 69Mb L: 28/29 MS: 1 ShuffleBytes- 00:06:41.314 [2024-12-02 15:22:26.979305] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (73876) > buf size (4096) 00:06:41.314 [2024-12-02 15:22:26.979946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:4824000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:26.979974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:26.980032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:26.980047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:26.980103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:26.980118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:26.980174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:26.980188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:26.980243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:26.980257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:41.314 #19 NEW cov: 11866 ft: 13640 corp: 10/233b lim: 30 exec/s: 0 rss: 69Mb L: 30/30 MS: 1 CopyPart- 00:06:41.314 [2024-12-02 15:22:27.029457] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (73876) > buf size (4096) 00:06:41.314 [2024-12-02 15:22:27.029786] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1d 00:06:41.314 [2024-12-02 15:22:27.030015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:4824000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:27.030042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:27.030101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:27.030115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:27.030175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:27.030189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 
15:22:27.030246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:27.030261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.314 #20 NEW cov: 11866 ft: 13703 corp: 11/262b lim: 30 exec/s: 0 rss: 69Mb L: 29/30 MS: 1 ChangeBinInt- 00:06:41.314 [2024-12-02 15:22:27.069556] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000024ff 00:06:41.314 [2024-12-02 15:22:27.069675] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.314 [2024-12-02 15:22:27.069791] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.314 [2024-12-02 15:22:27.069896] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.314 [2024-12-02 15:22:27.070121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:27.070148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:27.070205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff2483ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:27.070221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:27.070267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:27.070282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:27.070340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:27.070354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.314 #21 NEW cov: 11866 ft: 13722 corp: 12/291b lim: 30 exec/s: 0 rss: 69Mb L: 29/30 MS: 1 CopyPart- 00:06:41.314 [2024-12-02 15:22:27.109653] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (36908) > buf size (4096) 00:06:41.314 [2024-12-02 15:22:27.109779] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x75 00:06:41.314 [2024-12-02 15:22:27.110186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:240a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:27.110212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:27.110269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:27.110284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:06:41.314 [2024-12-02 15:22:27.110339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:27.110353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.314 [2024-12-02 15:22:27.110409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.314 [2024-12-02 15:22:27.110427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.314 #22 NEW cov: 11866 ft: 13757 corp: 13/319b lim: 30 exec/s: 0 rss: 69Mb L: 28/30 MS: 1 ChangeByte- 00:06:41.573 [2024-12-02 15:22:27.149776] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.573 [2024-12-02 15:22:27.149894] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.573 [2024-12-02 15:22:27.150006] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.573 [2024-12-02 15:22:27.150216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.150243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.573 [2024-12-02 15:22:27.150299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.150314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.573 [2024-12-02 15:22:27.150371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.150385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.573 #23 NEW cov: 11866 ft: 13833 corp: 14/341b lim: 30 exec/s: 0 rss: 69Mb L: 22/30 MS: 1 CopyPart- 00:06:41.573 [2024-12-02 15:22:27.189878] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.573 [2024-12-02 15:22:27.189997] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.573 [2024-12-02 15:22:27.190106] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000bfff 00:06:41.573 [2024-12-02 15:22:27.190330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.190357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.573 [2024-12-02 15:22:27.190415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.190429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 
p:0 m:0 dnr:0 00:06:41.573 [2024-12-02 15:22:27.190484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.190498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.573 #24 NEW cov: 11866 ft: 13941 corp: 15/363b lim: 30 exec/s: 0 rss: 69Mb L: 22/30 MS: 1 ChangeByte- 00:06:41.573 [2024-12-02 15:22:27.229902] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.573 [2024-12-02 15:22:27.230120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.230145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.573 #25 NEW cov: 11866 ft: 14413 corp: 16/374b lim: 30 exec/s: 0 rss: 69Mb L: 11/30 MS: 1 EraseBytes- 00:06:41.573 [2024-12-02 15:22:27.270153] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (36908) > buf size (4096) 00:06:41.573 [2024-12-02 15:22:27.270272] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x75 00:06:41.573 [2024-12-02 15:22:27.270686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:240a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.270713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.573 [2024-12-02 15:22:27.270771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.270786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.573 [2024-12-02 15:22:27.270842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.270856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.573 [2024-12-02 15:22:27.270910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.270924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.573 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:41.573 #26 NEW cov: 11889 ft: 14452 corp: 17/402b lim: 30 exec/s: 0 rss: 69Mb L: 28/30 MS: 1 ShuffleBytes- 00:06:41.573 [2024-12-02 15:22:27.320221] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.573 [2024-12-02 15:22:27.320336] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.573 [2024-12-02 15:22:27.320445] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.573 [2024-12-02 15:22:27.320659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.320686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.573 [2024-12-02 15:22:27.320743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.320759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.573 [2024-12-02 15:22:27.320814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.320829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.573 #27 NEW cov: 11889 ft: 14498 corp: 18/424b lim: 30 exec/s: 0 rss: 69Mb L: 22/30 MS: 1 CopyPart- 00:06:41.573 [2024-12-02 15:22:27.360339] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffbf 00:06:41.573 [2024-12-02 15:22:27.360455] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.573 [2024-12-02 15:22:27.360562] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.573 [2024-12-02 15:22:27.360767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.360794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.573 [2024-12-02 15:22:27.360851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.360866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.573 [2024-12-02 15:22:27.360923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.573 [2024-12-02 15:22:27.360937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.573 #28 NEW cov: 11889 ft: 14529 corp: 19/446b lim: 30 exec/s: 28 rss: 69Mb L: 22/30 MS: 1 ChangeBit- 00:06:41.832 [2024-12-02 15:22:27.400505] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (36908) > buf size (4096) 00:06:41.832 [2024-12-02 15:22:27.400624] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x75 00:06:41.832 [2024-12-02 15:22:27.400740] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1 00:06:41.832 [2024-12-02 15:22:27.401060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:240a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.832 [2024-12-02 15:22:27.401087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.832 [2024-12-02 15:22:27.401143] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.832 [2024-12-02 15:22:27.401158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.832 [2024-12-02 15:22:27.401215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.832 [2024-12-02 15:22:27.401229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.832 [2024-12-02 15:22:27.401283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.832 [2024-12-02 15:22:27.401296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.832 #29 NEW cov: 11889 ft: 14574 corp: 20/474b lim: 30 exec/s: 29 rss: 69Mb L: 28/30 MS: 1 ChangeBit- 00:06:41.832 [2024-12-02 15:22:27.440570] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.832 [2024-12-02 15:22:27.440687] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.832 [2024-12-02 15:22:27.440802] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.832 [2024-12-02 15:22:27.441025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.832 [2024-12-02 15:22:27.441052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.832 [2024-12-02 15:22:27.441107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.832 [2024-12-02 15:22:27.441122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.832 [2024-12-02 15:22:27.441176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.832 [2024-12-02 15:22:27.441190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.832 #30 NEW cov: 11889 ft: 14608 corp: 21/496b lim: 30 exec/s: 30 rss: 70Mb L: 22/30 MS: 1 CrossOver- 00:06:41.832 [2024-12-02 15:22:27.480719] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (36908) > buf size (4096) 00:06:41.832 [2024-12-02 15:22:27.480842] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x75 00:06:41.832 [2024-12-02 15:22:27.481278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:240a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.832 [2024-12-02 15:22:27.481308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.832 [2024-12-02 15:22:27.481367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 
cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.832 [2024-12-02 15:22:27.481382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.832 [2024-12-02 15:22:27.481438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.832 [2024-12-02 15:22:27.481453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.832 [2024-12-02 15:22:27.481506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.832 [2024-12-02 15:22:27.481521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.832 #31 NEW cov: 11889 ft: 14662 corp: 22/524b lim: 30 exec/s: 31 rss: 70Mb L: 28/30 MS: 1 ChangeBinInt- 00:06:41.832 [2024-12-02 15:22:27.520781] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000024ff 00:06:41.832 [2024-12-02 15:22:27.520900] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.832 [2024-12-02 15:22:27.521135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.833 [2024-12-02 15:22:27.521162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.833 [2024-12-02 15:22:27.521218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.833 [2024-12-02 15:22:27.521233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.833 #32 NEW cov: 11889 ft: 14898 corp: 23/538b lim: 30 exec/s: 32 rss: 70Mb L: 14/30 MS: 1 EraseBytes- 00:06:41.833 [2024-12-02 15:22:27.560963] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (73876) > buf size (4096) 00:06:41.833 [2024-12-02 15:22:27.561274] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262148) > buf size (4096) 00:06:41.833 [2024-12-02 15:22:27.561500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:4824000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.833 [2024-12-02 15:22:27.561527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.833 [2024-12-02 15:22:27.561584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.833 [2024-12-02 15:22:27.561599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.833 [2024-12-02 15:22:27.561653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.833 [2024-12-02 15:22:27.561667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:06:41.833 [2024-12-02 15:22:27.561723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00008100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.833 [2024-12-02 15:22:27.561742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.833 #33 NEW cov: 11889 ft: 14910 corp: 24/565b lim: 30 exec/s: 33 rss: 70Mb L: 27/30 MS: 1 EraseBytes- 00:06:41.833 [2024-12-02 15:22:27.601077] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000e2e2 00:06:41.833 [2024-12-02 15:22:27.601197] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.833 [2024-12-02 15:22:27.601307] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.833 [2024-12-02 15:22:27.601414] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.833 [2024-12-02 15:22:27.601641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffe202e2 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.833 [2024-12-02 15:22:27.601667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.833 [2024-12-02 15:22:27.601723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:e2ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.833 [2024-12-02 15:22:27.601742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.833 [2024-12-02 15:22:27.601798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.833 [2024-12-02 15:22:27.601812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.833 [2024-12-02 15:22:27.601867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.833 [2024-12-02 15:22:27.601881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:41.833 #34 NEW cov: 11889 ft: 14917 corp: 25/593b lim: 30 exec/s: 34 rss: 70Mb L: 28/30 MS: 1 InsertRepeatedBytes- 00:06:41.833 [2024-12-02 15:22:27.641209] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000024ff 00:06:41.833 [2024-12-02 15:22:27.641327] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.833 [2024-12-02 15:22:27.641436] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.833 [2024-12-02 15:22:27.641545] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:41.833 [2024-12-02 15:22:27.641766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.833 [2024-12-02 15:22:27.641792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.833 [2024-12-02 15:22:27.641851] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff2483ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.833 [2024-12-02 15:22:27.641866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.833 [2024-12-02 15:22:27.641919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff8323 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.833 [2024-12-02 15:22:27.641934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.833 [2024-12-02 15:22:27.641987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.833 [2024-12-02 15:22:27.642002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:42.091 #35 NEW cov: 11889 ft: 14928 corp: 26/622b lim: 30 exec/s: 35 rss: 70Mb L: 29/30 MS: 1 ChangeByte- 00:06:42.091 [2024-12-02 15:22:27.691343] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (36908) > buf size (4096) 00:06:42.091 [2024-12-02 15:22:27.691464] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x75 00:06:42.092 [2024-12-02 15:22:27.691573] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x1 00:06:42.092 [2024-12-02 15:22:27.691895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:240a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.691922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.092 [2024-12-02 15:22:27.691978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.691993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.092 [2024-12-02 15:22:27.692048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.692062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.092 [2024-12-02 15:22:27.692117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.692131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:42.092 #36 NEW cov: 11889 ft: 14949 corp: 27/650b lim: 30 exec/s: 36 rss: 70Mb L: 28/30 MS: 1 ShuffleBytes- 00:06:42.092 [2024-12-02 15:22:27.731424] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffbf 00:06:42.092 [2024-12-02 15:22:27.731543] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.092 [2024-12-02 15:22:27.731655] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.092 [2024-12-02 15:22:27.731896] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.731923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.092 [2024-12-02 15:22:27.731980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.731995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.092 [2024-12-02 15:22:27.732050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.732064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.092 #37 NEW cov: 11889 ft: 14953 corp: 28/672b lim: 30 exec/s: 37 rss: 70Mb L: 22/30 MS: 1 ShuffleBytes- 00:06:42.092 [2024-12-02 15:22:27.771542] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffbf 00:06:42.092 [2024-12-02 15:22:27.771664] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.092 [2024-12-02 15:22:27.771812] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.092 [2024-12-02 15:22:27.772027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.772054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.092 [2024-12-02 15:22:27.772111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.772126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.092 [2024-12-02 15:22:27.772182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.772199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.092 #38 NEW cov: 11889 ft: 14984 corp: 29/695b lim: 30 exec/s: 38 rss: 70Mb L: 23/30 MS: 1 InsertByte- 00:06:42.092 [2024-12-02 15:22:27.811688] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffbf 00:06:42.092 [2024-12-02 15:22:27.811812] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.092 [2024-12-02 15:22:27.811923] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.092 [2024-12-02 15:22:27.812150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.812177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.092 
[2024-12-02 15:22:27.812232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.812247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.092 [2024-12-02 15:22:27.812302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.812317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.092 #39 NEW cov: 11889 ft: 15054 corp: 30/718b lim: 30 exec/s: 39 rss: 70Mb L: 23/30 MS: 1 ChangeByte- 00:06:42.092 [2024-12-02 15:22:27.851868] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.092 [2024-12-02 15:22:27.851987] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.092 [2024-12-02 15:22:27.852098] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.092 [2024-12-02 15:22:27.852209] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.092 [2024-12-02 15:22:27.852436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.852462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.092 [2024-12-02 15:22:27.852520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.852534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.092 [2024-12-02 15:22:27.852590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.852604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.092 [2024-12-02 15:22:27.852659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.852673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:42.092 #40 NEW cov: 11889 ft: 15055 corp: 31/745b lim: 30 exec/s: 40 rss: 70Mb L: 27/30 MS: 1 InsertRepeatedBytes- 00:06:42.092 [2024-12-02 15:22:27.891951] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (36908) > buf size (4096) 00:06:42.092 [2024-12-02 15:22:27.892173] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524292) > buf size (4096) 00:06:42.092 [2024-12-02 15:22:27.892503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:240a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.892530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.092 [2024-12-02 15:22:27.892588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.892603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.092 [2024-12-02 15:22:27.892659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.892673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.092 [2024-12-02 15:22:27.892729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.092 [2024-12-02 15:22:27.892749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:42.350 #41 NEW cov: 11889 ft: 15061 corp: 32/773b lim: 30 exec/s: 41 rss: 70Mb L: 28/30 MS: 1 ChangeByte- 00:06:42.350 [2024-12-02 15:22:27.931940] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000fff7 00:06:42.350 [2024-12-02 15:22:27.932164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.350 [2024-12-02 15:22:27.932191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.350 #42 NEW cov: 11889 ft: 15081 corp: 33/784b lim: 30 exec/s: 42 rss: 70Mb L: 11/30 MS: 1 ChangeBit- 00:06:42.350 [2024-12-02 15:22:27.972161] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.350 [2024-12-02 15:22:27.972278] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.350 [2024-12-02 15:22:27.972387] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.350 [2024-12-02 15:22:27.972603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a5d83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.350 [2024-12-02 15:22:27.972629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.350 [2024-12-02 15:22:27.972685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.350 [2024-12-02 15:22:27.972699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.350 [2024-12-02 15:22:27.972749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.350 [2024-12-02 15:22:27.972763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.350 #43 NEW cov: 11889 ft: 15092 corp: 34/803b lim: 30 exec/s: 43 rss: 70Mb L: 19/30 MS: 1 EraseBytes- 00:06:42.350 [2024-12-02 15:22:28.012340] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: 
Get log page: len (36908) > buf size (4096) 00:06:42.350 [2024-12-02 15:22:28.012976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:240a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.350 [2024-12-02 15:22:28.013002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.350 [2024-12-02 15:22:28.013060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.350 [2024-12-02 15:22:28.013078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.350 [2024-12-02 15:22:28.013137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00750000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.350 [2024-12-02 15:22:28.013151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.350 [2024-12-02 15:22:28.013205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.350 [2024-12-02 15:22:28.013219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:42.350 [2024-12-02 15:22:28.013275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.350 [2024-12-02 15:22:28.013289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:42.350 #44 NEW cov: 11889 ft: 15136 corp: 35/833b lim: 30 exec/s: 44 rss: 70Mb L: 30/30 MS: 1 CopyPart- 00:06:42.350 [2024-12-02 15:22:28.052420] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000242d 00:06:42.350 [2024-12-02 15:22:28.052539] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.350 [2024-12-02 15:22:28.052647] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.350 [2024-12-02 15:22:28.052761] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.350 [2024-12-02 15:22:28.052975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.350 [2024-12-02 15:22:28.053001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.350 [2024-12-02 15:22:28.053059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff2483ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.350 [2024-12-02 15:22:28.053074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.350 [2024-12-02 15:22:28.053130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.350 [2024-12-02 15:22:28.053144] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.350 [2024-12-02 15:22:28.053199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.350 [2024-12-02 15:22:28.053213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:42.350 #45 NEW cov: 11889 ft: 15144 corp: 36/862b lim: 30 exec/s: 45 rss: 70Mb L: 29/30 MS: 1 ChangeByte- 00:06:42.350 [2024-12-02 15:22:28.092486] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.350 [2024-12-02 15:22:28.092603] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.350 [2024-12-02 15:22:28.092719] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.350 [2024-12-02 15:22:28.092942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.351 [2024-12-02 15:22:28.092969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.351 [2024-12-02 15:22:28.093026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.351 [2024-12-02 15:22:28.093044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.351 [2024-12-02 15:22:28.093101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.351 [2024-12-02 15:22:28.093115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.351 #46 NEW cov: 11889 ft: 15163 corp: 37/884b lim: 30 exec/s: 46 rss: 70Mb L: 22/30 MS: 1 CopyPart- 00:06:42.351 [2024-12-02 15:22:28.132651] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffbf 00:06:42.351 [2024-12-02 15:22:28.132777] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.351 [2024-12-02 15:22:28.132888] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.351 [2024-12-02 15:22:28.133114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.351 [2024-12-02 15:22:28.133141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.351 [2024-12-02 15:22:28.133201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.351 [2024-12-02 15:22:28.133216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.351 [2024-12-02 15:22:28.133275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.351 [2024-12-02 15:22:28.133290] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.351 #47 NEW cov: 11889 ft: 15167 corp: 38/906b lim: 30 exec/s: 47 rss: 70Mb L: 22/30 MS: 1 ChangeByte- 00:06:42.609 [2024-12-02 15:22:28.172791] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.609 [2024-12-02 15:22:28.172913] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.609 [2024-12-02 15:22:28.173025] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.609 [2024-12-02 15:22:28.173251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.609 [2024-12-02 15:22:28.173278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.609 [2024-12-02 15:22:28.173336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.609 [2024-12-02 15:22:28.173351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.609 [2024-12-02 15:22:28.173408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.609 [2024-12-02 15:22:28.173423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.609 #48 NEW cov: 11889 ft: 15200 corp: 39/928b lim: 30 exec/s: 48 rss: 70Mb L: 22/30 MS: 1 ChangeBit- 00:06:42.609 [2024-12-02 15:22:28.202831] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.609 [2024-12-02 15:22:28.202949] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.609 [2024-12-02 15:22:28.203062] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.609 [2024-12-02 15:22:28.203284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:fffb83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.609 [2024-12-02 15:22:28.203316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.609 [2024-12-02 15:22:28.203378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.609 [2024-12-02 15:22:28.203396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.609 [2024-12-02 15:22:28.203459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.609 [2024-12-02 15:22:28.203476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.609 #49 NEW cov: 11889 ft: 15227 corp: 40/950b lim: 30 exec/s: 49 rss: 70Mb L: 22/30 MS: 1 ChangeBinInt- 00:06:42.609 [2024-12-02 15:22:28.243058] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (36908) > buf size (4096) 
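The ctrlr.c:2504 and ctrlr.c:2516 errors repeating through this run come from two bounds checks on the fuzzed GET LOG PAGE fields: the page offset, and the transfer length derived from the NUMD dword count. The valid offset range depends on the length of the specific log page requested, which is why even tiny offsets like 0x1 and 0x75 are rejected above for short pages. A minimal sketch of that validation pattern, using illustrative names and the 4096-byte buffer size reported in the log (this is not SPDK's actual code):

/*
 * Illustrative sketch of the checks behind the ctrlr.c:2504/2516 *ERROR*
 * lines. Function and constant names are assumptions for illustration.
 */
#include <inttypes.h>
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

#define IOV_BUF_SIZE 4096	/* matches "buf size (4096)" in the log */

static bool
get_log_page_ok(uint64_t offset, uint64_t page_len,
		uint16_t numdl, uint16_t numdu)
{
	/* NUMD is a 0's-based dword count split across two command fields. */
	uint64_t len = ((((uint64_t)numdu << 16) | numdl) + 1) * 4;

	/* page_len is the length of the requested page; it varies per page,
	 * so small offsets can still be out of range. */
	if (offset >= page_len) {
		fprintf(stderr, "Invalid log page offset 0x%" PRIx64 "\n",
		    offset);
		return false;
	}
	if (len > IOV_BUF_SIZE) {
		fprintf(stderr, "Get log page: len (%" PRIu64 ") > buf size (%d)\n",
		    len, IOV_BUF_SIZE);
		return false;
	}
	return true;
}

The arithmetic lines up with the records above: cdw10 0x240a0000 carries NUMDL 0x240a, and (0x240a + 1) * 4 = 36908, exactly the rejected length in the "len (36908) > buf size (4096)" lines.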
00:06:42.609 [2024-12-02 15:22:28.243282] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (119812) > buf size (4096) 00:06:42.609 [2024-12-02 15:22:28.243607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:240a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.609 [2024-12-02 15:22:28.243634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.609 [2024-12-02 15:22:28.243690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.609 [2024-12-02 15:22:28.243705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.609 [2024-12-02 15:22:28.243769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:75000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.609 [2024-12-02 15:22:28.243783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.609 [2024-12-02 15:22:28.243838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.609 [2024-12-02 15:22:28.243853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:42.609 #50 NEW cov: 11889 ft: 15228 corp: 41/979b lim: 30 exec/s: 50 rss: 70Mb L: 29/30 MS: 1 InsertByte- 00:06:42.609 [2024-12-02 15:22:28.283101] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffbf 00:06:42.609 [2024-12-02 15:22:28.283219] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000fffe 00:06:42.609 [2024-12-02 15:22:28.283334] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.609 [2024-12-02 15:22:28.283575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.609 [2024-12-02 15:22:28.283603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.609 [2024-12-02 15:22:28.283661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.609 [2024-12-02 15:22:28.283675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.610 [2024-12-02 15:22:28.283731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:32ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.610 [2024-12-02 15:22:28.283751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.610 #51 NEW cov: 11889 ft: 15233 corp: 42/1002b lim: 30 exec/s: 51 rss: 70Mb L: 23/30 MS: 1 ChangeBit- 00:06:42.610 [2024-12-02 15:22:28.323217] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x3000024ff 00:06:42.610 [2024-12-02 15:22:28.323339] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid 
log page offset 0x30000ffff 00:06:42.610 [2024-12-02 15:22:28.323453] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff7f 00:06:42.610 [2024-12-02 15:22:28.323561] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:06:42.610 [2024-12-02 15:22:28.323794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.610 [2024-12-02 15:22:28.323821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.610 [2024-12-02 15:22:28.323878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff2483ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.610 [2024-12-02 15:22:28.323892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.610 [2024-12-02 15:22:28.323949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff8323 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.610 [2024-12-02 15:22:28.323963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.610 [2024-12-02 15:22:28.324018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.610 [2024-12-02 15:22:28.324032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:42.610 #52 NEW cov: 11889 ft: 15239 corp: 43/1031b lim: 30 exec/s: 52 rss: 70Mb L: 29/30 MS: 1 ChangeBit- 00:06:42.610 [2024-12-02 15:22:28.363367] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (36908) > buf size (4096) 00:06:42.610 [2024-12-02 15:22:28.363892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:240a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.610 [2024-12-02 15:22:28.363918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.610 [2024-12-02 15:22:28.363974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.610 [2024-12-02 15:22:28.363990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.610 [2024-12-02 15:22:28.364047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.610 [2024-12-02 15:22:28.364061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:42.610 [2024-12-02 15:22:28.364118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0000001c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.610 [2024-12-02 15:22:28.364132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:42.610 #53 NEW cov: 11889 ft: 15246 corp: 44/1059b lim: 
30 exec/s: 26 rss: 70Mb L: 28/30 MS: 1 ChangeBinInt- 00:06:42.610 #53 DONE cov: 11889 ft: 15246 corp: 44/1059b lim: 30 exec/s: 26 rss: 70Mb 00:06:42.610 Done 53 runs in 2 second(s) 00:06:42.869 15:22:28 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:06:42.869 15:22:28 -- ../common.sh@72 -- # (( i++ )) 00:06:42.869 15:22:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:42.869 15:22:28 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:06:42.869 15:22:28 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:06:42.869 15:22:28 -- nvmf/run.sh@24 -- # local timen=1 00:06:42.869 15:22:28 -- nvmf/run.sh@25 -- # local core=0x1 00:06:42.869 15:22:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:06:42.869 15:22:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:06:42.869 15:22:28 -- nvmf/run.sh@29 -- # printf %02d 2 00:06:42.869 15:22:28 -- nvmf/run.sh@29 -- # port=4402 00:06:42.869 15:22:28 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:06:42.869 15:22:28 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:06:42.869 15:22:28 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:42.869 15:22:28 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:06:42.869 [2024-12-02 15:22:28.547360] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:42.869 [2024-12-02 15:22:28.547437] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1596127 ] 00:06:42.869 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.127 [2024-12-02 15:22:28.798135] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.127 [2024-12-02 15:22:28.888937] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:43.127 [2024-12-02 15:22:28.889059] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.384 [2024-12-02 15:22:28.946687] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:43.384 [2024-12-02 15:22:28.963005] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:06:43.384 INFO: Running with entropic power schedule (0xFF, 100). 00:06:43.384 INFO: Seed: 2023768920 00:06:43.384 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:43.384 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:43.384 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:06:43.384 INFO: A corpus is not provided, starting from an empty corpus 00:06:43.384 #2 INITED exec/s: 0 rss: 60Mb 00:06:43.384 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
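The run.sh trace above shows how each fuzz target is launched: a per-target JSON config is produced by rewriting trsvcid 4420 to 4402 with sed, a dedicated corpus directory is created, and llvm_nvme_fuzz is started with -Z 2 against that listener. The NEW_FUNC lines below name its libFuzzer entry points (TestOneInput and fuzz_admin_identify_command in llvm_nvme_fuzz.c); the following is a skeleton of that entry-point shape only, with a hypothetical command layout rather than the harness's real decoding:

/*
 * Skeleton libFuzzer entry point of the shape the NEW_FUNC lines refer to.
 * struct fuzz_command is a hypothetical stand-in; the real harness decodes
 * the input into an NVMe admin command and submits it over the TCP
 * transport configured above (trsvcid 4402).
 */
#include <stddef.h>
#include <stdint.h>
#include <string.h>

struct fuzz_command {
	uint8_t  opc;	/* admin opcode, e.g. 0x06 for IDENTIFY */
	uint32_t nsid;
	uint32_t cdw10;
	uint32_t cdw11;
};

int
LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
	struct fuzz_command cmd = {0};

	if (size < sizeof(cmd)) {
		return 0;	/* not enough bytes to form a command */
	}
	memcpy(&cmd, data, sizeof(cmd));

	/* Submitting cmd to the target is what produces the *NOTICE*
	 * command/completion pairs seen throughout this log. */
	return 0;
}

Giving each numbered target its own trsvcid and config file appears to be what lets the nightly job run the fuzzers back to back without colliding on a single listener.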
00:06:43.384 This may also happen if the target rejected all inputs we tried so far 00:06:43.384 [2024-12-02 15:22:29.007991] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.384 [2024-12-02 15:22:29.008110] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.384 [2024-12-02 15:22:29.008326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.384 [2024-12-02 15:22:29.008358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.384 [2024-12-02 15:22:29.008417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.384 [2024-12-02 15:22:29.008433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.642 NEW_FUNC[1/670]: 0x43db78 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:06:43.642 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:43.642 #5 NEW cov: 11579 ft: 11578 corp: 2/16b lim: 35 exec/s: 0 rss: 68Mb L: 15/15 MS: 3 CrossOver-ChangeBinInt-InsertRepeatedBytes- 00:06:43.642 [2024-12-02 15:22:29.308852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:bdb5002a cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.642 [2024-12-02 15:22:29.308890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.642 #15 NEW cov: 11702 ft: 12428 corp: 3/26b lim: 35 exec/s: 0 rss: 69Mb L: 10/15 MS: 5 InsertByte-InsertByte-ChangeBinInt-ChangeBit-InsertRepeatedBytes- 00:06:43.642 [2024-12-02 15:22:29.348725] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.642 [2024-12-02 15:22:29.348851] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.642 [2024-12-02 15:22:29.349060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.642 [2024-12-02 15:22:29.349089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.642 [2024-12-02 15:22:29.349143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000f00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.642 [2024-12-02 15:22:29.349159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.642 #16 NEW cov: 11708 ft: 12674 corp: 4/41b lim: 35 exec/s: 0 rss: 69Mb L: 15/15 MS: 1 ChangeBinInt- 00:06:43.642 [2024-12-02 15:22:29.388819] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.642 [2024-12-02 15:22:29.388931] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.642 [2024-12-02 15:22:29.389125] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.642 [2024-12-02 15:22:29.389153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.642 [2024-12-02 15:22:29.389207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.642 [2024-12-02 15:22:29.389223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.642 #17 NEW cov: 11793 ft: 12922 corp: 5/56b lim: 35 exec/s: 0 rss: 69Mb L: 15/15 MS: 1 ChangeBit- 00:06:43.642 [2024-12-02 15:22:29.429115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:98980098 cdw11:98009898 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.642 [2024-12-02 15:22:29.429141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.642 #18 NEW cov: 11793 ft: 12967 corp: 6/67b lim: 35 exec/s: 0 rss: 69Mb L: 11/15 MS: 1 InsertRepeatedBytes- 00:06:43.643 [2024-12-02 15:22:29.459211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:bdb5002a cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.643 [2024-12-02 15:22:29.459237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.901 #19 NEW cov: 11793 ft: 13041 corp: 7/80b lim: 35 exec/s: 0 rss: 69Mb L: 13/15 MS: 1 InsertRepeatedBytes- 00:06:43.901 [2024-12-02 15:22:29.499333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4cbd002a cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.901 [2024-12-02 15:22:29.499359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.901 #20 NEW cov: 11793 ft: 13108 corp: 8/91b lim: 35 exec/s: 0 rss: 69Mb L: 11/15 MS: 1 InsertByte- 00:06:43.901 [2024-12-02 15:22:29.539420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:bdb5002a cdw11:b5007eb5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.901 [2024-12-02 15:22:29.539446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.901 #21 NEW cov: 11793 ft: 13159 corp: 9/102b lim: 35 exec/s: 0 rss: 69Mb L: 11/15 MS: 1 InsertByte- 00:06:43.901 [2024-12-02 15:22:29.569502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4cbd002a cdw11:0000b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.901 [2024-12-02 15:22:29.569528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.901 #22 NEW cov: 11793 ft: 13275 corp: 10/113b lim: 35 exec/s: 0 rss: 69Mb L: 11/15 MS: 1 CMP- DE: "\000\000\000\002"- 00:06:43.901 [2024-12-02 15:22:29.609484] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.901 [2024-12-02 15:22:29.609599] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.901 [2024-12-02 15:22:29.609817] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.901 [2024-12-02 15:22:29.609844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.901 [2024-12-02 15:22:29.609897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.901 [2024-12-02 15:22:29.609914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:43.901 #23 NEW cov: 11793 ft: 13318 corp: 11/128b lim: 35 exec/s: 0 rss: 69Mb L: 15/15 MS: 1 ShuffleBytes- 00:06:43.901 [2024-12-02 15:22:29.649571] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:43.901 [2024-12-02 15:22:29.649784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.901 [2024-12-02 15:22:29.649810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.901 #24 NEW cov: 11793 ft: 13425 corp: 12/139b lim: 35 exec/s: 0 rss: 69Mb L: 11/15 MS: 1 EraseBytes- 00:06:43.901 [2024-12-02 15:22:29.689908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:bdb5002a cdw11:b5007eb5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:43.901 [2024-12-02 15:22:29.689935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:43.901 #25 NEW cov: 11793 ft: 13468 corp: 13/151b lim: 35 exec/s: 0 rss: 69Mb L: 12/15 MS: 1 InsertByte- 00:06:44.160 [2024-12-02 15:22:29.729997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4cbd002a cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.160 [2024-12-02 15:22:29.730022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.160 #26 NEW cov: 11793 ft: 13545 corp: 14/162b lim: 35 exec/s: 0 rss: 69Mb L: 11/15 MS: 1 CrossOver- 00:06:44.160 [2024-12-02 15:22:29.769937] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.160 [2024-12-02 15:22:29.770052] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.160 [2024-12-02 15:22:29.770251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.160 [2024-12-02 15:22:29.770279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.160 [2024-12-02 15:22:29.770333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000f00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.160 [2024-12-02 15:22:29.770349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.160 #27 NEW cov: 11793 ft: 13623 corp: 15/177b lim: 35 exec/s: 0 rss: 69Mb L: 15/15 MS: 1 ShuffleBytes- 
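This second run drives IDENTIFY commands, and its recurring ctrlr.c:2598 errors show the namespace lookup rejecting NSID 0 before any per-namespace work happens, which is why those commands complete with INVALID NAMESPACE OR FORMAT (00/0b). A sketch of that 1-based NSID check, with an assumed namespace count (not SPDK's actual function):

/*
 * Illustrative sketch of the lookup behind the ctrlr.c:2598 lines. NSID is
 * 1-based, so 0 can never name a namespace. SUBSYS_MAX_NSID is an assumed
 * constant for illustration.
 */
#include <stdint.h>
#include <stdio.h>

#define SUBSYS_MAX_NSID 32u	/* assumed namespace count for the subsystem */

struct ns;	/* opaque namespace handle */

static struct ns *
subsystem_get_ns_safe(struct ns **ns_table, uint32_t nsid)
{
	if (nsid == 0 || nsid > SUBSYS_MAX_NSID) {
		fprintf(stderr, "Identify Namespace for invalid NSID %u\n",
		    nsid);
		/* caller completes the command with
		 * INVALID NAMESPACE OR FORMAT (00/0b) */
		return NULL;
	}
	return ns_table[nsid - 1];
}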
00:06:44.160 [2024-12-02 15:22:29.810022] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.160 [2024-12-02 15:22:29.810225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:000b0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.160 [2024-12-02 15:22:29.810252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.160 #28 NEW cov: 11793 ft: 13651 corp: 16/188b lim: 35 exec/s: 0 rss: 70Mb L: 11/15 MS: 1 ChangeBinInt- 00:06:44.160 [2024-12-02 15:22:29.850161] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.160 [2024-12-02 15:22:29.850277] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.160 [2024-12-02 15:22:29.850478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.160 [2024-12-02 15:22:29.850504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.160 [2024-12-02 15:22:29.850574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:fdff0000 cdw11:00000f00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.160 [2024-12-02 15:22:29.850590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.160 #29 NEW cov: 11793 ft: 13662 corp: 17/203b lim: 35 exec/s: 0 rss: 70Mb L: 15/15 MS: 1 ChangeBinInt- 00:06:44.160 [2024-12-02 15:22:29.890797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4cbd002a cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.160 [2024-12-02 15:22:29.890822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.160 [2024-12-02 15:22:29.890877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00b5 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.160 [2024-12-02 15:22:29.890891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.160 [2024-12-02 15:22:29.890945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.160 [2024-12-02 15:22:29.890959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:44.160 [2024-12-02 15:22:29.891011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.160 [2024-12-02 15:22:29.891025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:44.160 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:44.160 #30 NEW cov: 11816 ft: 14285 corp: 18/235b lim: 35 exec/s: 0 rss: 70Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:06:44.160 [2024-12-02 15:22:29.940454] 
ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.160 [2024-12-02 15:22:29.940571] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.160 [2024-12-02 15:22:29.940882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.160 [2024-12-02 15:22:29.940910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.160 [2024-12-02 15:22:29.940965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000f00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.160 [2024-12-02 15:22:29.940984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.160 [2024-12-02 15:22:29.941036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:a8a800a8 cdw11:a800a8a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.160 [2024-12-02 15:22:29.941050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:44.160 #31 NEW cov: 11816 ft: 14457 corp: 19/257b lim: 35 exec/s: 0 rss: 70Mb L: 22/32 MS: 1 InsertRepeatedBytes- 00:06:44.418 [2024-12-02 15:22:29.980531] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.418 [2024-12-02 15:22:29.980747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.418 [2024-12-02 15:22:29.980773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.418 #32 NEW cov: 11816 ft: 14461 corp: 20/268b lim: 35 exec/s: 32 rss: 70Mb L: 11/32 MS: 1 ChangeBinInt- 00:06:44.418 [2024-12-02 15:22:30.020665] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.418 [2024-12-02 15:22:30.020789] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.418 [2024-12-02 15:22:30.020998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.418 [2024-12-02 15:22:30.021027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.418 [2024-12-02 15:22:30.021084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.418 [2024-12-02 15:22:30.021100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.418 #33 NEW cov: 11816 ft: 14481 corp: 21/283b lim: 35 exec/s: 33 rss: 70Mb L: 15/32 MS: 1 ShuffleBytes- 00:06:44.418 [2024-12-02 15:22:30.060774] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.418 [2024-12-02 15:22:30.060891] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.418 [2024-12-02 15:22:30.061103] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.418 [2024-12-02 15:22:30.061132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.418 [2024-12-02 15:22:30.061187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:a800a8a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.418 [2024-12-02 15:22:30.061204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.418 #34 NEW cov: 11816 ft: 14499 corp: 22/301b lim: 35 exec/s: 34 rss: 70Mb L: 18/32 MS: 1 EraseBytes- 00:06:44.418 [2024-12-02 15:22:30.100898] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.418 [2024-12-02 15:22:30.101216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.418 [2024-12-02 15:22:30.101244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.418 [2024-12-02 15:22:30.101299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:000000f7 cdw11:00000f00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.418 [2024-12-02 15:22:30.101315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.418 #35 NEW cov: 11816 ft: 14503 corp: 23/316b lim: 35 exec/s: 35 rss: 70Mb L: 15/32 MS: 1 ChangeBinInt- 00:06:44.418 [2024-12-02 15:22:30.141160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:bdb5002a cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.418 [2024-12-02 15:22:30.141187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.418 #36 NEW cov: 11816 ft: 14549 corp: 24/329b lim: 35 exec/s: 36 rss: 70Mb L: 13/32 MS: 1 ChangeByte- 00:06:44.418 [2024-12-02 15:22:30.181057] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.418 [2024-12-02 15:22:30.181263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.418 [2024-12-02 15:22:30.181301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.418 #37 NEW cov: 11816 ft: 14554 corp: 25/338b lim: 35 exec/s: 37 rss: 70Mb L: 9/32 MS: 1 EraseBytes- 00:06:44.418 [2024-12-02 15:22:30.221294] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.418 [2024-12-02 15:22:30.221507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4cbd002a cdw11:0000b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.419 [2024-12-02 15:22:30.221532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.419 [2024-12-02 15:22:30.221587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 
cdw10:02b50000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.419 [2024-12-02 15:22:30.221603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.677 #38 NEW cov: 11816 ft: 14580 corp: 26/357b lim: 35 exec/s: 38 rss: 70Mb L: 19/32 MS: 1 InsertRepeatedBytes- 00:06:44.677 [2024-12-02 15:22:30.261483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4cbd002e cdw11:0000b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.677 [2024-12-02 15:22:30.261525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.677 #39 NEW cov: 11816 ft: 14651 corp: 27/368b lim: 35 exec/s: 39 rss: 70Mb L: 11/32 MS: 1 ChangeBit- 00:06:44.677 [2024-12-02 15:22:30.301484] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.677 [2024-12-02 15:22:30.301601] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.677 [2024-12-02 15:22:30.301812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.677 [2024-12-02 15:22:30.301839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.677 [2024-12-02 15:22:30.301894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.677 [2024-12-02 15:22:30.301909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.677 #40 NEW cov: 11816 ft: 14668 corp: 28/383b lim: 35 exec/s: 40 rss: 70Mb L: 15/32 MS: 1 ChangeByte- 00:06:44.677 [2024-12-02 15:22:30.341752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:bdb5002a cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.677 [2024-12-02 15:22:30.341778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.677 #41 NEW cov: 11816 ft: 14727 corp: 29/393b lim: 35 exec/s: 41 rss: 70Mb L: 10/32 MS: 1 ShuffleBytes- 00:06:44.677 [2024-12-02 15:22:30.382142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:98980098 cdw11:98009898 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.677 [2024-12-02 15:22:30.382172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.677 [2024-12-02 15:22:30.382229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:98980098 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.677 [2024-12-02 15:22:30.382243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.677 [2024-12-02 15:22:30.382299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.677 [2024-12-02 15:22:30.382314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:06:44.677 #42 NEW cov: 11816 ft: 14779 corp: 30/420b lim: 35 exec/s: 42 rss: 70Mb L: 27/32 MS: 1 InsertRepeatedBytes- 00:06:44.677 [2024-12-02 15:22:30.421801] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.677 [2024-12-02 15:22:30.422013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0f000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.677 [2024-12-02 15:22:30.422040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.677 #43 NEW cov: 11816 ft: 14781 corp: 31/430b lim: 35 exec/s: 43 rss: 70Mb L: 10/32 MS: 1 EraseBytes- 00:06:44.677 [2024-12-02 15:22:30.461940] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.677 [2024-12-02 15:22:30.462147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:80000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.677 [2024-12-02 15:22:30.462171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.677 #44 NEW cov: 11816 ft: 14784 corp: 32/439b lim: 35 exec/s: 44 rss: 70Mb L: 9/32 MS: 1 ChangeBit- 00:06:44.935 [2024-12-02 15:22:30.502100] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.935 [2024-12-02 15:22:30.502396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:bd00022a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.935 [2024-12-02 15:22:30.502424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.935 [2024-12-02 15:22:30.502481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500b5 cdw11:b500b53d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.935 [2024-12-02 15:22:30.502495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.935 #45 NEW cov: 11816 ft: 14808 corp: 33/456b lim: 35 exec/s: 45 rss: 70Mb L: 17/32 MS: 1 PersAutoDict- DE: "\000\000\000\002"- 00:06:44.935 [2024-12-02 15:22:30.542190] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.935 [2024-12-02 15:22:30.542308] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.936 [2024-12-02 15:22:30.542509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.936 [2024-12-02 15:22:30.542537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.936 [2024-12-02 15:22:30.542591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:a800a8a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.936 [2024-12-02 15:22:30.542607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.936 #46 NEW cov: 11816 ft: 14874 corp: 34/475b lim: 35 exec/s: 46 rss: 70Mb L: 19/32 MS: 1 InsertByte- 00:06:44.936 [2024-12-02 
15:22:30.582348] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:44.936 [2024-12-02 15:22:30.582744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:7b00027b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.936 [2024-12-02 15:22:30.582770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.936 [2024-12-02 15:22:30.582825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:7b7b007b cdw11:7b007b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.936 [2024-12-02 15:22:30.582840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.936 [2024-12-02 15:22:30.582893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:b5b500bd cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.936 [2024-12-02 15:22:30.582907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:44.936 #47 NEW cov: 11816 ft: 14894 corp: 35/501b lim: 35 exec/s: 47 rss: 70Mb L: 26/32 MS: 1 InsertRepeatedBytes- 00:06:44.936 [2024-12-02 15:22:30.622576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:98980098 cdw11:98009898 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.936 [2024-12-02 15:22:30.622601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.936 #48 NEW cov: 11816 ft: 14905 corp: 36/512b lim: 35 exec/s: 48 rss: 70Mb L: 11/32 MS: 1 ChangeBit- 00:06:44.936 [2024-12-02 15:22:30.662660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:bdb5002a cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.936 [2024-12-02 15:22:30.662687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.936 #49 NEW cov: 11816 ft: 14949 corp: 37/525b lim: 35 exec/s: 49 rss: 70Mb L: 13/32 MS: 1 ChangeBit- 00:06:44.936 [2024-12-02 15:22:30.703239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0ebd002a cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.936 [2024-12-02 15:22:30.703265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.936 [2024-12-02 15:22:30.703320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00b5 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.936 [2024-12-02 15:22:30.703334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.936 [2024-12-02 15:22:30.703387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.936 [2024-12-02 15:22:30.703402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:44.936 [2024-12-02 15:22:30.703455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff 
cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.936 [2024-12-02 15:22:30.703468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:44.936 #50 NEW cov: 11816 ft: 14974 corp: 38/557b lim: 35 exec/s: 50 rss: 70Mb L: 32/32 MS: 1 ChangeByte- 00:06:44.936 [2024-12-02 15:22:30.753492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4cbd002a cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.936 [2024-12-02 15:22:30.753518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:44.936 [2024-12-02 15:22:30.753571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00b5 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.936 [2024-12-02 15:22:30.753588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:44.936 [2024-12-02 15:22:30.753642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.936 [2024-12-02 15:22:30.753656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:44.936 [2024-12-02 15:22:30.753707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.936 [2024-12-02 15:22:30.753721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:44.936 [2024-12-02 15:22:30.753779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:ffff00ff cdw11:b500ff02 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:44.936 [2024-12-02 15:22:30.753793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:45.194 #51 NEW cov: 11816 ft: 15029 corp: 39/592b lim: 35 exec/s: 51 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:06:45.194 [2024-12-02 15:22:30.793071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:bdb5002a cdw11:b5002ab5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:45.194 [2024-12-02 15:22:30.793097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.194 #52 NEW cov: 11816 ft: 15044 corp: 40/604b lim: 35 exec/s: 52 rss: 70Mb L: 12/35 MS: 1 ChangeByte- 00:06:45.194 [2024-12-02 15:22:30.833047] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:45.194 [2024-12-02 15:22:30.833346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:7b7b0000 cdw11:7b007b7b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:45.194 [2024-12-02 15:22:30.833374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.194 [2024-12-02 15:22:30.833428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:b5b500bd cdw11:b500b5b5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:45.194 [2024-12-02 15:22:30.833442] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.194 #53 NEW cov: 11816 ft: 15056 corp: 41/623b lim: 35 exec/s: 53 rss: 70Mb L: 19/35 MS: 1 EraseBytes- 00:06:45.194 [2024-12-02 15:22:30.873192] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:45.194 [2024-12-02 15:22:30.873318] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:45.194 [2024-12-02 15:22:30.873524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:45.194 [2024-12-02 15:22:30.873552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.194 [2024-12-02 15:22:30.873606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000a8a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:45.194 [2024-12-02 15:22:30.873622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.194 #54 NEW cov: 11816 ft: 15058 corp: 42/642b lim: 35 exec/s: 54 rss: 70Mb L: 19/35 MS: 1 PersAutoDict- DE: "\000\000\000\002"- 00:06:45.194 [2024-12-02 15:22:30.913340] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:45.194 [2024-12-02 15:22:30.913463] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:45.194 [2024-12-02 15:22:30.913670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:45.194 [2024-12-02 15:22:30.913697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.194 [2024-12-02 15:22:30.913755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:45.194 [2024-12-02 15:22:30.913771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.194 #55 NEW cov: 11816 ft: 15067 corp: 43/657b lim: 35 exec/s: 55 rss: 70Mb L: 15/35 MS: 1 ChangeBit- 00:06:45.194 [2024-12-02 15:22:30.953377] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:45.194 [2024-12-02 15:22:30.953668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:b8330000 cdw11:ac0006ab SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:45.194 [2024-12-02 15:22:30.953694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.194 [2024-12-02 15:22:30.953751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000096 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:45.194 [2024-12-02 15:22:30.953767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.194 #56 NEW cov: 11816 ft: 15071 corp: 44/672b lim: 35 exec/s: 56 rss: 70Mb L: 15/35 MS: 1 CMP- DE: "\2703\006\253\254K\226\000"- 00:06:45.194 
[2024-12-02 15:22:30.993529] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:45.194 [2024-12-02 15:22:30.993639] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:45.194 [2024-12-02 15:22:30.993750] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:45.194 [2024-12-02 15:22:30.993945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:45.194 [2024-12-02 15:22:30.993974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.194 [2024-12-02 15:22:30.994031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:45.194 [2024-12-02 15:22:30.994057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.195 [2024-12-02 15:22:30.994113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ff0000fd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:45.195 [2024-12-02 15:22:30.994128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:45.453 #57 NEW cov: 11816 ft: 15111 corp: 45/697b lim: 35 exec/s: 28 rss: 70Mb L: 25/35 MS: 1 InsertRepeatedBytes- 00:06:45.453 #57 DONE cov: 11816 ft: 15111 corp: 45/697b lim: 35 exec/s: 28 rss: 70Mb 00:06:45.453 ###### Recommended dictionary. ###### 00:06:45.453 "\000\000\000\002" # Uses: 2 00:06:45.453 "\2703\006\253\254K\226\000" # Uses: 0 00:06:45.453 ###### End of recommended dictionary. 
###### 00:06:45.453 Done 57 runs in 2 second(s) 00:06:45.453 15:22:31 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:06:45.453 15:22:31 -- ../common.sh@72 -- # (( i++ )) 00:06:45.453 15:22:31 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:45.453 15:22:31 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:06:45.453 15:22:31 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:06:45.453 15:22:31 -- nvmf/run.sh@24 -- # local timen=1 00:06:45.453 15:22:31 -- nvmf/run.sh@25 -- # local core=0x1 00:06:45.453 15:22:31 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:06:45.453 15:22:31 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:06:45.453 15:22:31 -- nvmf/run.sh@29 -- # printf %02d 3 00:06:45.453 15:22:31 -- nvmf/run.sh@29 -- # port=4403 00:06:45.453 15:22:31 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:06:45.453 15:22:31 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:06:45.453 15:22:31 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:45.453 15:22:31 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:06:45.453 [2024-12-02 15:22:31.184525] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:45.453 [2024-12-02 15:22:31.184590] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1596475 ] 00:06:45.453 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.712 [2024-12-02 15:22:31.430579] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.712 [2024-12-02 15:22:31.513108] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:45.712 [2024-12-02 15:22:31.513237] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.971 [2024-12-02 15:22:31.571379] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:45.971 [2024-12-02 15:22:31.587699] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:06:45.971 INFO: Running with entropic power schedule (0xFF, 100). 00:06:45.971 INFO: Seed: 351805355 00:06:45.971 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:45.971 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:45.971 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:06:45.971 INFO: A corpus is not provided, starting from an empty corpus 00:06:45.971 #2 INITED exec/s: 0 rss: 60Mb 00:06:45.971 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
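Note on the harness lines above: run.sh gives each fuzzer its own NVMe/TCP listener by deriving the port from the fuzzer index (44 plus the zero-padded index, hence 4403 for fuzzer 3), building the transport ID string, and rewriting trsvcid in a copy of fuzz_json.conf before launching llvm_nvme_fuzz. A minimal sketch of that templating, assuming the sed output is redirected into the per-fuzzer config (the log shows the sed invocation but not its redirection):

fuzzer_type=3
port=44$(printf %02d "${fuzzer_type}")   # log: printf %02d 3 -> port=4403
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"
nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
# Assumed redirection: point the listener at the per-fuzzer port instead of the default 4420.
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" fuzz_json.conf > "${nvmf_cfg}"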
00:06:45.971 This may also happen if the target rejected all inputs we tried so far 00:06:46.230 NEW_FUNC[1/658]: 0x43f858 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:06:46.230 NEW_FUNC[2/658]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:46.230 #3 NEW cov: 11488 ft: 11494 corp: 2/13b lim: 20 exec/s: 0 rss: 68Mb L: 12/12 MS: 1 InsertRepeatedBytes- 00:06:46.230 NEW_FUNC[1/1]: 0x1c310f8 in accel_comp_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/accel/accel_sw.c:554 00:06:46.230 #4 NEW cov: 11622 ft: 12384 corp: 3/33b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 CMP- DE: "\3017\352\220\250K\226\000"- 00:06:46.488 #5 NEW cov: 11628 ft: 12692 corp: 4/53b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 CopyPart- 00:06:46.488 #6 NEW cov: 11713 ft: 13294 corp: 5/58b lim: 20 exec/s: 0 rss: 68Mb L: 5/20 MS: 1 InsertRepeatedBytes- 00:06:46.488 #7 NEW cov: 11713 ft: 13495 corp: 6/78b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 ChangeBinInt- 00:06:46.488 #9 NEW cov: 11714 ft: 13740 corp: 7/88b lim: 20 exec/s: 0 rss: 68Mb L: 10/20 MS: 2 CrossOver-InsertRepeatedBytes- 00:06:46.488 #10 NEW cov: 11714 ft: 13951 corp: 8/93b lim: 20 exec/s: 0 rss: 68Mb L: 5/20 MS: 1 ShuffleBytes- 00:06:46.488 #11 NEW cov: 11714 ft: 13999 corp: 9/113b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 CMP- DE: "\015J\356\343\250K\226\000"- 00:06:46.488 #12 NEW cov: 11715 ft: 14041 corp: 10/130b lim: 20 exec/s: 0 rss: 68Mb L: 17/20 MS: 1 EraseBytes- 00:06:46.747 #13 NEW cov: 11715 ft: 14091 corp: 11/145b lim: 20 exec/s: 0 rss: 69Mb L: 15/20 MS: 1 CrossOver- 00:06:46.747 #14 NEW cov: 11715 ft: 14129 corp: 12/165b lim: 20 exec/s: 0 rss: 69Mb L: 20/20 MS: 1 ChangeBinInt- 00:06:46.747 #15 NEW cov: 11715 ft: 14139 corp: 13/171b lim: 20 exec/s: 0 rss: 69Mb L: 6/20 MS: 1 InsertByte- 00:06:46.747 #16 NEW cov: 11715 ft: 14195 corp: 14/185b lim: 20 exec/s: 0 rss: 69Mb L: 14/20 MS: 1 EraseBytes- 00:06:46.747 #17 NEW cov: 11715 ft: 14228 corp: 15/191b lim: 20 exec/s: 0 rss: 69Mb L: 6/20 MS: 1 InsertByte- 00:06:46.747 NEW_FUNC[1/5]: 0x111e188 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:06:46.747 NEW_FUNC[2/5]: 0x111ed08 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:06:46.747 #18 NEW cov: 11822 ft: 14372 corp: 16/211b lim: 20 exec/s: 0 rss: 69Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:06:47.006 #19 NEW cov: 11822 ft: 14408 corp: 17/229b lim: 20 exec/s: 0 rss: 69Mb L: 18/20 MS: 1 InsertByte- 00:06:47.006 #21 NEW cov: 11822 ft: 14426 corp: 18/238b lim: 20 exec/s: 21 rss: 69Mb L: 9/20 MS: 2 ChangeByte-PersAutoDict- DE: "\3017\352\220\250K\226\000"- 00:06:47.006 #22 NEW cov: 11822 ft: 14492 corp: 19/253b lim: 20 exec/s: 22 rss: 69Mb L: 15/20 MS: 1 InsertRepeatedBytes- 00:06:47.006 #23 NEW cov: 11822 ft: 14502 corp: 20/265b lim: 20 exec/s: 23 rss: 69Mb L: 12/20 MS: 1 ShuffleBytes- 00:06:47.006 #24 NEW cov: 11822 ft: 14511 corp: 21/285b lim: 20 exec/s: 24 rss: 69Mb L: 20/20 MS: 1 CrossOver- 00:06:47.006 #25 NEW cov: 11822 ft: 14529 corp: 22/304b lim: 20 exec/s: 25 rss: 69Mb L: 19/20 MS: 1 InsertByte- 00:06:47.266 #26 NEW cov: 11822 ft: 14542 corp: 23/323b lim: 20 exec/s: 26 rss: 69Mb L: 19/20 MS: 1 ShuffleBytes- 00:06:47.266 #27 NEW cov: 11822 ft: 14600 corp: 24/343b lim: 20 exec/s: 27 rss: 69Mb L: 20/20 MS: 1 PersAutoDict- DE: "\3017\352\220\250K\226\000"- 
00:06:47.266 #28 NEW cov: 11822 ft: 14607 corp: 25/352b lim: 20 exec/s: 28 rss: 69Mb L: 9/20 MS: 1 CrossOver- 00:06:47.266 #29 NEW cov: 11822 ft: 14646 corp: 26/372b lim: 20 exec/s: 29 rss: 69Mb L: 20/20 MS: 1 CrossOver- 00:06:47.266 #30 NEW cov: 11822 ft: 14647 corp: 27/384b lim: 20 exec/s: 30 rss: 69Mb L: 12/20 MS: 1 CrossOver- 00:06:47.266 [2024-12-02 15:22:33.031489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:06:47.266 [2024-12-02 15:22:33.031535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.266 NEW_FUNC[1/15]: 0x1537108 in nvme_ctrlr_process_async_event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:3091 00:06:47.266 NEW_FUNC[2/15]: 0x15cd498 in nvme_ctrlr_queue_async_event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:3131 00:06:47.266 #31 NEW cov: 12039 ft: 14886 corp: 28/399b lim: 20 exec/s: 31 rss: 69Mb L: 15/20 MS: 1 InsertRepeatedBytes- 00:06:47.524 #32 NEW cov: 12039 ft: 14941 corp: 29/417b lim: 20 exec/s: 32 rss: 69Mb L: 18/20 MS: 1 ShuffleBytes- 00:06:47.524 #33 NEW cov: 12039 ft: 14953 corp: 30/430b lim: 20 exec/s: 33 rss: 69Mb L: 13/20 MS: 1 EraseBytes- 00:06:47.524 #34 NEW cov: 12039 ft: 14972 corp: 31/443b lim: 20 exec/s: 34 rss: 69Mb L: 13/20 MS: 1 ChangeByte- 00:06:47.524 #35 NEW cov: 12039 ft: 15007 corp: 32/451b lim: 20 exec/s: 35 rss: 69Mb L: 8/20 MS: 1 InsertRepeatedBytes- 00:06:47.524 #36 NEW cov: 12039 ft: 15015 corp: 33/467b lim: 20 exec/s: 36 rss: 69Mb L: 16/20 MS: 1 InsertByte- 00:06:47.524 #37 NEW cov: 12039 ft: 15052 corp: 34/487b lim: 20 exec/s: 37 rss: 69Mb L: 20/20 MS: 1 CrossOver- 00:06:47.783 #38 NEW cov: 12039 ft: 15069 corp: 35/504b lim: 20 exec/s: 38 rss: 69Mb L: 17/20 MS: 1 CopyPart- 00:06:47.783 #39 NEW cov: 12039 ft: 15073 corp: 36/511b lim: 20 exec/s: 39 rss: 70Mb L: 7/20 MS: 1 CopyPart- 00:06:47.783 #43 NEW cov: 12039 ft: 15078 corp: 37/520b lim: 20 exec/s: 43 rss: 70Mb L: 9/20 MS: 4 ChangeByte-CrossOver-ChangeByte-PersAutoDict- DE: "\015J\356\343\250K\226\000"- 00:06:47.783 #44 NEW cov: 12039 ft: 15114 corp: 38/540b lim: 20 exec/s: 44 rss: 70Mb L: 20/20 MS: 1 CopyPart- 00:06:47.783 #45 NEW cov: 12039 ft: 15155 corp: 39/558b lim: 20 exec/s: 45 rss: 70Mb L: 18/20 MS: 1 InsertByte- 00:06:47.783 #46 NEW cov: 12039 ft: 15161 corp: 40/577b lim: 20 exec/s: 46 rss: 70Mb L: 19/20 MS: 1 ChangeBinInt- 00:06:47.783 [2024-12-02 15:22:33.583366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:06:47.783 [2024-12-02 15:22:33.583402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.042 #47 NEW cov: 12039 ft: 15204 corp: 41/594b lim: 20 exec/s: 47 rss: 70Mb L: 17/20 MS: 1 InsertRepeatedBytes- 00:06:48.042 #48 NEW cov: 12039 ft: 15210 corp: 42/604b lim: 20 exec/s: 24 rss: 70Mb L: 10/20 MS: 1 EraseBytes- 00:06:48.043 #48 DONE cov: 12039 ft: 15210 corp: 42/604b lim: 20 exec/s: 24 rss: 70Mb 00:06:48.043 ###### Recommended dictionary. ###### 00:06:48.043 "\3017\352\220\250K\226\000" # Uses: 2 00:06:48.043 "\015J\356\343\250K\226\000" # Uses: 1 00:06:48.043 ###### End of recommended dictionary. 
###### 00:06:48.043 Done 48 runs in 2 second(s) 00:06:48.043 15:22:33 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:06:48.043 15:22:33 -- ../common.sh@72 -- # (( i++ )) 00:06:48.043 15:22:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:48.043 15:22:33 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:06:48.043 15:22:33 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:06:48.043 15:22:33 -- nvmf/run.sh@24 -- # local timen=1 00:06:48.043 15:22:33 -- nvmf/run.sh@25 -- # local core=0x1 00:06:48.043 15:22:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:06:48.043 15:22:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:06:48.043 15:22:33 -- nvmf/run.sh@29 -- # printf %02d 4 00:06:48.043 15:22:33 -- nvmf/run.sh@29 -- # port=4404 00:06:48.043 15:22:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:06:48.043 15:22:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:06:48.043 15:22:33 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:48.043 15:22:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:06:48.043 [2024-12-02 15:22:33.815470] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:48.043 [2024-12-02 15:22:33.815548] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1597021 ] 00:06:48.043 EAL: No free 2048 kB hugepages reported on node 1 00:06:48.301 [2024-12-02 15:22:34.062457] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.559 [2024-12-02 15:22:34.150552] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:48.559 [2024-12-02 15:22:34.150673] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.559 [2024-12-02 15:22:34.208475] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:48.559 [2024-12-02 15:22:34.224802] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:06:48.559 INFO: Running with entropic power schedule (0xFF, 100). 00:06:48.559 INFO: Seed: 2990816078 00:06:48.559 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:48.559 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:48.559 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:06:48.559 INFO: A corpus is not provided, starting from an empty corpus 00:06:48.559 #2 INITED exec/s: 0 rss: 60Mb 00:06:48.559 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
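Note on the "Recommended dictionary" summaries above: at the end of each run, libFuzzer lists the byte sequences (printed here as octal escapes) that produced new coverage, with a use count. A plain libFuzzer target can be seeded with these via the standard -dict= option; the sketch below assumes the entries are rewritten in the \xNN hex form the dictionary parser expects, and the file path is illustrative (whether SPDK's llvm_nvme_fuzz wrapper forwards -dict= is not shown in this log):

cat > /tmp/nvmf_3.dict <<'EOF'
# Entries transcribed from the run-3 summary above, converted from the
# octal escapes "\3017\352\220\250K\226\000" and "\015J\356\343\250K\226\000".
"\xc1\x37\xea\x90\xa8\x4b\x96\x00"
"\x0d\x4a\xee\xe3\xa8\x4b\x96\x00"
EOF
# A plain libFuzzer binary would then take: -dict=/tmp/nvmf_3.dict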
00:06:48.559 This may also happen if the target rejected all inputs we tried so far 00:06:48.559 [2024-12-02 15:22:34.280202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.559 [2024-12-02 15:22:34.280232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.559 [2024-12-02 15:22:34.280286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.559 [2024-12-02 15:22:34.280305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.817 NEW_FUNC[1/671]: 0x440958 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:06:48.817 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:48.817 #15 NEW cov: 11600 ft: 11596 corp: 2/16b lim: 35 exec/s: 0 rss: 68Mb L: 15/15 MS: 3 CopyPart-CopyPart-InsertRepeatedBytes- 00:06:48.817 [2024-12-02 15:22:34.600954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.817 [2024-12-02 15:22:34.601001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.817 #16 NEW cov: 11714 ft: 12911 corp: 3/23b lim: 35 exec/s: 0 rss: 68Mb L: 7/15 MS: 1 InsertRepeatedBytes- 00:06:49.076 [2024-12-02 15:22:34.651359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c0a0a0a cdw11:0a2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.076 [2024-12-02 15:22:34.651386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.076 [2024-12-02 15:22:34.651440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.076 [2024-12-02 15:22:34.651454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.076 [2024-12-02 15:22:34.651505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.076 [2024-12-02 15:22:34.651518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.076 [2024-12-02 15:22:34.651567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.076 [2024-12-02 15:22:34.651580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:49.076 #17 NEW cov: 11720 ft: 13530 corp: 4/52b lim: 35 exec/s: 0 rss: 68Mb L: 29/29 MS: 1 CopyPart- 00:06:49.076 [2024-12-02 15:22:34.691143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a0a cdw11:2c2c0000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.076 [2024-12-02 15:22:34.691168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.076 [2024-12-02 15:22:34.691222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:242c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.076 [2024-12-02 15:22:34.691236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.076 #18 NEW cov: 11805 ft: 13791 corp: 5/67b lim: 35 exec/s: 0 rss: 68Mb L: 15/29 MS: 1 ChangeBinInt- 00:06:49.076 [2024-12-02 15:22:34.731575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c0a0a0a cdw11:0a2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.076 [2024-12-02 15:22:34.731600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.076 [2024-12-02 15:22:34.731653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.076 [2024-12-02 15:22:34.731668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.076 [2024-12-02 15:22:34.731724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.076 [2024-12-02 15:22:34.731742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.076 [2024-12-02 15:22:34.731793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.076 [2024-12-02 15:22:34.731807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:49.076 #19 NEW cov: 11805 ft: 13837 corp: 6/101b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 CopyPart- 00:06:49.076 [2024-12-02 15:22:34.771352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.076 [2024-12-02 15:22:34.771377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.076 [2024-12-02 15:22:34.771431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.076 [2024-12-02 15:22:34.771446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.076 #20 NEW cov: 11805 ft: 14009 corp: 7/116b lim: 35 exec/s: 0 rss: 68Mb L: 15/34 MS: 1 ShuffleBytes- 00:06:49.076 [2024-12-02 15:22:34.811816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c0a0a0a cdw11:0a2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.076 [2024-12-02 15:22:34.811843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.076 [2024-12-02 15:22:34.811898] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.076 [2024-12-02 15:22:34.811913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.076 [2024-12-02 15:22:34.811964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.077 [2024-12-02 15:22:34.811977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.077 [2024-12-02 15:22:34.812029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.077 [2024-12-02 15:22:34.812042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:49.077 #21 NEW cov: 11805 ft: 14052 corp: 8/146b lim: 35 exec/s: 0 rss: 68Mb L: 30/34 MS: 1 CopyPart- 00:06:49.077 [2024-12-02 15:22:34.851602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.077 [2024-12-02 15:22:34.851627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.077 [2024-12-02 15:22:34.851680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.077 [2024-12-02 15:22:34.851694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.077 #22 NEW cov: 11805 ft: 14109 corp: 9/161b lim: 35 exec/s: 0 rss: 68Mb L: 15/34 MS: 1 CrossOver- 00:06:49.077 [2024-12-02 15:22:34.891777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.077 [2024-12-02 15:22:34.891803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.077 [2024-12-02 15:22:34.891861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c242c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.077 [2024-12-02 15:22:34.891875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.336 #23 NEW cov: 11805 ft: 14142 corp: 10/176b lim: 35 exec/s: 0 rss: 68Mb L: 15/34 MS: 1 ChangeBit- 00:06:49.336 [2024-12-02 15:22:34.931814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.336 [2024-12-02 15:22:34.931839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.336 [2024-12-02 15:22:34.931892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c242c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.336 [2024-12-02 15:22:34.931905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.336 #24 NEW cov: 11805 ft: 14240 corp: 11/191b lim: 35 exec/s: 0 rss: 68Mb L: 15/34 MS: 1 ShuffleBytes- 00:06:49.336 [2024-12-02 15:22:34.971784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000031 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.336 [2024-12-02 15:22:34.971809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.336 #28 NEW cov: 11805 ft: 14274 corp: 12/198b lim: 35 exec/s: 0 rss: 68Mb L: 7/34 MS: 4 ChangeByte-InsertRepeatedBytes-ShuffleBytes-InsertByte- 00:06:49.336 [2024-12-02 15:22:35.012239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.336 [2024-12-02 15:22:35.012264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.336 [2024-12-02 15:22:35.012319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c242c2c cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.336 [2024-12-02 15:22:35.012332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.336 [2024-12-02 15:22:35.012386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ff2cffff cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.336 [2024-12-02 15:22:35.012400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.336 #29 NEW cov: 11805 ft: 14532 corp: 13/219b lim: 35 exec/s: 0 rss: 68Mb L: 21/34 MS: 1 CrossOver- 00:06:49.336 [2024-12-02 15:22:35.052475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c0a0a0a cdw11:0a2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.336 [2024-12-02 15:22:35.052499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.336 [2024-12-02 15:22:35.052553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c21 cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.336 [2024-12-02 15:22:35.052567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.336 [2024-12-02 15:22:35.052619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.336 [2024-12-02 15:22:35.052632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.337 [2024-12-02 15:22:35.052685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.337 [2024-12-02 15:22:35.052700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:49.337 #30 NEW cov: 11805 ft: 14574 corp: 14/250b lim: 35 exec/s: 0 rss: 68Mb L: 31/34 MS: 1 InsertByte- 00:06:49.337 [2024-12-02 15:22:35.092166] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.337 [2024-12-02 15:22:35.092191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.337 #31 NEW cov: 11805 ft: 14672 corp: 15/258b lim: 35 exec/s: 0 rss: 68Mb L: 8/34 MS: 1 EraseBytes- 00:06:49.337 [2024-12-02 15:22:35.132485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:312c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.337 [2024-12-02 15:22:35.132509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.337 [2024-12-02 15:22:35.132562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c242c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.337 [2024-12-02 15:22:35.132575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.337 #32 NEW cov: 11805 ft: 14815 corp: 16/273b lim: 35 exec/s: 0 rss: 69Mb L: 15/34 MS: 1 ChangeBinInt- 00:06:49.605 [2024-12-02 15:22:35.172524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c0a0a0a cdw11:0a2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.605 [2024-12-02 15:22:35.172549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.605 [2024-12-02 15:22:35.172602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.605 [2024-12-02 15:22:35.172616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.605 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:49.605 #33 NEW cov: 11828 ft: 14919 corp: 17/290b lim: 35 exec/s: 0 rss: 69Mb L: 17/34 MS: 1 EraseBytes- 00:06:49.605 [2024-12-02 15:22:35.212496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.605 [2024-12-02 15:22:35.212521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.605 #34 NEW cov: 11828 ft: 14932 corp: 18/298b lim: 35 exec/s: 0 rss: 69Mb L: 8/34 MS: 1 CrossOver- 00:06:49.605 [2024-12-02 15:22:35.252644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.605 [2024-12-02 15:22:35.252668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.605 #37 NEW cov: 11828 ft: 14947 corp: 19/309b lim: 35 exec/s: 37 rss: 69Mb L: 11/34 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:06:49.605 [2024-12-02 15:22:35.292900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:312c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.605 [2024-12-02 15:22:35.292924] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.605 [2024-12-02 15:22:35.292976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c242c32 cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.605 [2024-12-02 15:22:35.292990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.605 #38 NEW cov: 11828 ft: 14999 corp: 20/324b lim: 35 exec/s: 38 rss: 69Mb L: 15/34 MS: 1 ChangeByte- 00:06:49.605 [2024-12-02 15:22:35.333054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a2c cdw11:2c0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.605 [2024-12-02 15:22:35.333078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.605 [2024-12-02 15:22:35.333132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.605 [2024-12-02 15:22:35.333146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.605 #39 NEW cov: 11828 ft: 15016 corp: 21/339b lim: 35 exec/s: 39 rss: 69Mb L: 15/34 MS: 1 ShuffleBytes- 00:06:49.605 [2024-12-02 15:22:35.372949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.605 [2024-12-02 15:22:35.372974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.605 #40 NEW cov: 11828 ft: 15039 corp: 22/350b lim: 35 exec/s: 40 rss: 69Mb L: 11/34 MS: 1 ShuffleBytes- 00:06:49.605 [2024-12-02 15:22:35.413429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:312c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.605 [2024-12-02 15:22:35.413454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.605 [2024-12-02 15:22:35.413510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c242c32 cdw11:2c2c0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.605 [2024-12-02 15:22:35.413524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.605 [2024-12-02 15:22:35.413577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f6f6f6f6 cdw11:f6f60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.605 [2024-12-02 15:22:35.413591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.863 #41 NEW cov: 11828 ft: 15053 corp: 23/375b lim: 35 exec/s: 41 rss: 69Mb L: 25/34 MS: 1 InsertRepeatedBytes- 00:06:49.863 [2024-12-02 15:22:35.453651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c0a0a0a cdw11:0a2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.863 [2024-12-02 15:22:35.453675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.863 [2024-12-02 15:22:35.453730] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c21 cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.863 [2024-12-02 15:22:35.453748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.863 [2024-12-02 15:22:35.453802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.863 [2024-12-02 15:22:35.453815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.863 [2024-12-02 15:22:35.453869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.863 [2024-12-02 15:22:35.453882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:49.863 #42 NEW cov: 11828 ft: 15065 corp: 24/406b lim: 35 exec/s: 42 rss: 69Mb L: 31/34 MS: 1 ShuffleBytes- 00:06:49.863 [2024-12-02 15:22:35.493616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a313a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.863 [2024-12-02 15:22:35.493644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.863 [2024-12-02 15:22:35.493698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:322c2c2c cdw11:242c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.863 [2024-12-02 15:22:35.493712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.863 [2024-12-02 15:22:35.493764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:f6f6f6f6 cdw11:f6f60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.863 [2024-12-02 15:22:35.493778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.863 #43 NEW cov: 11828 ft: 15076 corp: 25/432b lim: 35 exec/s: 43 rss: 69Mb L: 26/34 MS: 1 InsertByte- 00:06:49.863 [2024-12-02 15:22:35.533467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.863 [2024-12-02 15:22:35.533492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.863 #44 NEW cov: 11828 ft: 15085 corp: 26/439b lim: 35 exec/s: 44 rss: 69Mb L: 7/34 MS: 1 CopyPart- 00:06:49.863 [2024-12-02 15:22:35.573884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.863 [2024-12-02 15:22:35.573909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.863 [2024-12-02 15:22:35.573963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2f cdw11:240a0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.863 [2024-12-02 15:22:35.573976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.863 [2024-12-02 15:22:35.574030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.863 [2024-12-02 15:22:35.574043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.863 [2024-12-02 15:22:35.614062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.863 [2024-12-02 15:22:35.614088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.863 [2024-12-02 15:22:35.614140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2f2cc62c cdw11:2c240000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.863 [2024-12-02 15:22:35.614154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.863 [2024-12-02 15:22:35.614210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.863 [2024-12-02 15:22:35.614224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.863 #46 NEW cov: 11828 ft: 15098 corp: 27/462b lim: 35 exec/s: 46 rss: 69Mb L: 23/34 MS: 2 InsertByte-InsertByte- 00:06:49.863 [2024-12-02 15:22:35.653746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d3d3f3f5 cdw11:d3d30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.863 [2024-12-02 15:22:35.653772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.863 #47 NEW cov: 11828 ft: 15108 corp: 28/470b lim: 35 exec/s: 47 rss: 70Mb L: 8/34 MS: 1 ChangeBinInt- 00:06:50.121 [2024-12-02 15:22:35.694263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.121 [2024-12-02 15:22:35.694291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.121 [2024-12-02 15:22:35.694345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2f2cc62c cdw11:2c240000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.121 [2024-12-02 15:22:35.694362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.121 [2024-12-02 15:22:35.694414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.121 [2024-12-02 15:22:35.694427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:50.121 #48 NEW cov: 11828 ft: 15116 corp: 29/493b lim: 35 exec/s: 48 rss: 70Mb L: 23/34 MS: 1 ChangeBit- 00:06:50.121 [2024-12-02 15:22:35.734017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00310000 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:06:50.121 [2024-12-02 15:22:35.734042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.121 [2024-12-02 15:22:35.774149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00310000 cdw11:00040000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.121 [2024-12-02 15:22:35.774173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.121 #50 NEW cov: 11828 ft: 15135 corp: 30/500b lim: 35 exec/s: 50 rss: 70Mb L: 7/34 MS: 2 ShuffleBytes-ChangeBit- 00:06:50.121 [2024-12-02 15:22:35.814744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.121 [2024-12-02 15:22:35.814770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.121 [2024-12-02 15:22:35.814825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c242c2c cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.121 [2024-12-02 15:22:35.814839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.121 [2024-12-02 15:22:35.814892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ff2cffff cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.121 [2024-12-02 15:22:35.814904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:50.121 [2024-12-02 15:22:35.814956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.121 [2024-12-02 15:22:35.814969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:50.121 #51 NEW cov: 11828 ft: 15144 corp: 31/531b lim: 35 exec/s: 51 rss: 70Mb L: 31/34 MS: 1 CopyPart- 00:06:50.121 [2024-12-02 15:22:35.854862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d4ed0a0a cdw11:0a2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.121 [2024-12-02 15:22:35.854888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.121 [2024-12-02 15:22:35.854943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.121 [2024-12-02 15:22:35.854957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.121 [2024-12-02 15:22:35.855011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.122 [2024-12-02 15:22:35.855029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:50.122 [2024-12-02 15:22:35.855080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.122 [2024-12-02 15:22:35.855093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:50.122 #52 NEW cov: 11828 ft: 15175 corp: 32/565b lim: 35 exec/s: 52 rss: 70Mb L: 34/34 MS: 1 ChangeBinInt- 00:06:50.122 [2024-12-02 15:22:35.894969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c0a0a0a cdw11:0a2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.122 [2024-12-02 15:22:35.894994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.122 [2024-12-02 15:22:35.895047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.122 [2024-12-02 15:22:35.895061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.122 [2024-12-02 15:22:35.895112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.122 [2024-12-02 15:22:35.895126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:50.122 [2024-12-02 15:22:35.895179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.122 [2024-12-02 15:22:35.895192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:50.122 #53 NEW cov: 11828 ft: 15180 corp: 33/595b lim: 35 exec/s: 53 rss: 70Mb L: 30/34 MS: 1 CrossOver- 00:06:50.122 [2024-12-02 15:22:35.934742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:312c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.122 [2024-12-02 15:22:35.934767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.122 [2024-12-02 15:22:35.934821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c242c32 cdw11:2ce00000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.122 [2024-12-02 15:22:35.934834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.380 #54 NEW cov: 11828 ft: 15187 corp: 34/611b lim: 35 exec/s: 54 rss: 70Mb L: 16/34 MS: 1 InsertByte- 00:06:50.380 [2024-12-02 15:22:35.974851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.380 [2024-12-02 15:22:35.974876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.380 [2024-12-02 15:22:35.974931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c242c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.380 [2024-12-02 15:22:35.974944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.380 #55 NEW cov: 11828 ft: 15210 corp: 35/626b lim: 
35 exec/s: 55 rss: 70Mb L: 15/34 MS: 1 ShuffleBytes- 00:06:50.380 [2024-12-02 15:22:36.005127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c0a0a0a cdw11:0a2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.380 [2024-12-02 15:22:36.005152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.380 [2024-12-02 15:22:36.005208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c21 cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.380 [2024-12-02 15:22:36.005222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.380 [2024-12-02 15:22:36.005274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.380 [2024-12-02 15:22:36.005288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:50.380 #56 NEW cov: 11828 ft: 15224 corp: 36/648b lim: 35 exec/s: 56 rss: 70Mb L: 22/34 MS: 1 EraseBytes- 00:06:50.380 [2024-12-02 15:22:36.045129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a2c cdw11:2c0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.380 [2024-12-02 15:22:36.045153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.380 [2024-12-02 15:22:36.045207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.380 [2024-12-02 15:22:36.045221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.380 #57 NEW cov: 11828 ft: 15232 corp: 37/663b lim: 35 exec/s: 57 rss: 70Mb L: 15/34 MS: 1 ChangeBit- 00:06:50.380 [2024-12-02 15:22:36.085030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000031 cdw11:00100000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.380 [2024-12-02 15:22:36.085055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.380 #58 NEW cov: 11828 ft: 15244 corp: 38/670b lim: 35 exec/s: 58 rss: 70Mb L: 7/34 MS: 1 ChangeBit- 00:06:50.380 [2024-12-02 15:22:36.125463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c2c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.380 [2024-12-02 15:22:36.125489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.380 [2024-12-02 15:22:36.125546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c242c2c cdw11:0aff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.380 [2024-12-02 15:22:36.125560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.380 [2024-12-02 15:22:36.125612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:ff2cffff cdw11:2e2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
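The "#N NEW cov:" records threaded through the command/completion notices above are libFuzzer status lines: "cov" is the number of instrumented code points hit so far, "ft" the number of features, "corp" the corpus entry count and total size, "lim" the current input-length cap, "exec/s" and "rss" throughput and memory, "L" the new input's length against the largest in the corpus, and "MS" the mutation sequence that produced it. A small filter along these lines (a hypothetical helper, not part of the autotest scripts) strips a console log like this one down to just those progress records:

  # fuzz-progress.sh (illustrative only): keep the libFuzzer status and
  # summary lines, drop the per-command NVMe notices.
  grep -E '#[0-9]+ (INITED|NEW|REDUCE|DONE)|Done [0-9]+ runs' "$1"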
00:06:50.380 [2024-12-02 15:22:36.125626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:50.380 #59 NEW cov: 11828 ft: 15249 corp: 39/691b lim: 35 exec/s: 59 rss: 70Mb L: 21/34 MS: 1 ChangeBit- 00:06:50.380 [2024-12-02 15:22:36.165375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:092c0a0a cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.380 [2024-12-02 15:22:36.165400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.380 [2024-12-02 15:22:36.165455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.380 [2024-12-02 15:22:36.165468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.380 #60 NEW cov: 11828 ft: 15265 corp: 40/707b lim: 35 exec/s: 60 rss: 70Mb L: 16/34 MS: 1 InsertByte- 00:06:50.638 [2024-12-02 15:22:36.205670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:2c0a0a0a cdw11:0a2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.638 [2024-12-02 15:22:36.205698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.638 [2024-12-02 15:22:36.205756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:2c2c2c21 cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.638 [2024-12-02 15:22:36.205771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.638 [2024-12-02 15:22:36.205823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:2c2c2c2c cdw11:2c2c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.638 [2024-12-02 15:22:36.205837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:50.638 #61 NEW cov: 11828 ft: 15279 corp: 41/734b lim: 35 exec/s: 61 rss: 70Mb L: 27/34 MS: 1 EraseBytes- 00:06:50.638 [2024-12-02 15:22:36.245473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d3d3f3f5 cdw11:d3d30003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:50.638 [2024-12-02 15:22:36.245497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.638 #62 NEW cov: 11828 ft: 15336 corp: 42/741b lim: 35 exec/s: 31 rss: 70Mb L: 7/34 MS: 1 EraseBytes- 00:06:50.638 #62 DONE cov: 11828 ft: 15336 corp: 42/741b lim: 35 exec/s: 31 rss: 70Mb 00:06:50.638 Done 62 runs in 2 second(s) 00:06:50.638 15:22:36 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:06:50.638 15:22:36 -- ../common.sh@72 -- # (( i++ )) 00:06:50.638 15:22:36 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:50.638 15:22:36 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:06:50.638 15:22:36 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:06:50.638 15:22:36 -- nvmf/run.sh@24 -- # local timen=1 00:06:50.638 15:22:36 -- nvmf/run.sh@25 -- # local core=0x1 00:06:50.639 15:22:36 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:06:50.639 
15:22:36 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf
00:06:50.639 15:22:36 -- nvmf/run.sh@29 -- # printf %02d 5
00:06:50.639 15:22:36 -- nvmf/run.sh@29 -- # port=4405
00:06:50.639 15:22:36 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5
00:06:50.639 15:22:36 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405'
00:06:50.639 15:22:36 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:06:50.639 15:22:36 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock
00:06:50.897 [2024-12-02 15:22:36.439009] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:06:50.897 [2024-12-02 15:22:36.439076] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1597558 ]
00:06:50.897 EAL: No free 2048 kB hugepages reported on node 1
00:06:50.896 [2024-12-02 15:22:36.688945] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:51.155 [2024-12-02 15:22:36.777295] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:06:51.155 [2024-12-02 15:22:36.777436] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:06:51.155 [2024-12-02 15:22:36.835150] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:06:51.155 [2024-12-02 15:22:36.851477] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 ***
00:06:51.155 INFO: Running with entropic power schedule (0xFF, 100).
00:06:51.155 INFO: Seed: 1321834379
00:06:51.155 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:06:51.155 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:06:51.155 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5
00:06:51.155 INFO: A corpus is not provided, starting from an empty corpus
00:06:51.155 #2 INITED exec/s: 0 rss: 60Mb
00:06:51.155 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
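Read together, the nvmf/run.sh trace above wires up one fuzzer instance: instance number 5 is folded into TCP port 4405 (the printf %02d 5 step suggests a fixed 44 prefix plus the zero-padded instance number), a per-instance corpus directory and JSON config are prepared, and llvm_nvme_fuzz is aimed at the NVMe/TCP listener that then comes up on that port. A condensed sketch of those steps (an approximation, not the real script: SPDK_DIR stands in for the workspace paths, and directing sed's output into the per-instance config is inferred from the nvmf_cfg assignment):

  # Sketch of the traced per-instance setup; SPDK_DIR, the "44" port
  # prefix, and the sed redirect are assumptions, not verbatim run.sh.
  fuzzer_type=5
  port="44$(printf %02d "$fuzzer_type")"   # 5 -> 4405
  corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_$fuzzer_type"
  nvmf_cfg="/tmp/fuzz_json_$fuzzer_type.conf"
  mkdir -p "$corpus_dir"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  # Point the template config at this instance's port before starting the target.
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
  "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
      -P "$SPDK_DIR/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t 1 \
      -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk$fuzzer_type.sock"

Threading the same instance number through the port, config, corpus directory, RPC socket, and -Z flag appears to be what keeps several such fuzzers isolated when they run side by side in one job.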
00:06:51.155 This may also happen if the target rejected all inputs we tried so far 00:06:51.155 [2024-12-02 15:22:36.897054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.155 [2024-12-02 15:22:36.897081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.155 [2024-12-02 15:22:36.897134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.155 [2024-12-02 15:22:36.897148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.155 [2024-12-02 15:22:36.897201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.155 [2024-12-02 15:22:36.897214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.413 NEW_FUNC[1/671]: 0x442af8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:06:51.413 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:51.413 #14 NEW cov: 11612 ft: 11613 corp: 2/32b lim: 45 exec/s: 0 rss: 68Mb L: 31/31 MS: 2 CopyPart-InsertRepeatedBytes- 00:06:51.413 [2024-12-02 15:22:37.197612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.413 [2024-12-02 15:22:37.197656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.413 [2024-12-02 15:22:37.197713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.413 [2024-12-02 15:22:37.197727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.413 #15 NEW cov: 11725 ft: 12455 corp: 3/52b lim: 45 exec/s: 0 rss: 68Mb L: 20/31 MS: 1 EraseBytes- 00:06:51.673 [2024-12-02 15:22:37.247828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.673 [2024-12-02 15:22:37.247854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.673 [2024-12-02 15:22:37.247910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.673 [2024-12-02 15:22:37.247925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.673 [2024-12-02 15:22:37.247978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.673 [2024-12-02 15:22:37.247991] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.673 #20 NEW cov: 11731 ft: 12712 corp: 4/80b lim: 45 exec/s: 0 rss: 68Mb L: 28/31 MS: 5 InsertByte-ChangeByte-CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:06:51.673 [2024-12-02 15:22:37.287917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff953c3c cdw11:4bb10001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.673 [2024-12-02 15:22:37.287946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.673 [2024-12-02 15:22:37.288000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3c3c4e3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.673 [2024-12-02 15:22:37.288014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.673 [2024-12-02 15:22:37.288067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.673 [2024-12-02 15:22:37.288081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.673 #21 NEW cov: 11816 ft: 12987 corp: 5/108b lim: 45 exec/s: 0 rss: 68Mb L: 28/31 MS: 1 CMP- DE: "\377\225K\261?\242\321N"- 00:06:51.673 [2024-12-02 15:22:37.328032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.673 [2024-12-02 15:22:37.328057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.673 [2024-12-02 15:22:37.328110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.673 [2024-12-02 15:22:37.328124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.673 [2024-12-02 15:22:37.328178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:3c3c3c3c cdw11:3c2b0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.673 [2024-12-02 15:22:37.328191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.673 #22 NEW cov: 11816 ft: 13121 corp: 6/136b lim: 45 exec/s: 0 rss: 68Mb L: 28/31 MS: 1 ChangeByte- 00:06:51.673 [2024-12-02 15:22:37.367988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.673 [2024-12-02 15:22:37.368014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.673 [2024-12-02 15:22:37.368068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:5d000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.673 [2024-12-02 15:22:37.368082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.673 #23 NEW cov: 11816 ft: 13221 corp: 7/157b lim: 45 exec/s: 0 rss: 68Mb L: 21/31 MS: 1 InsertByte- 
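The DE: "\377\225K\261?\242\321N" suffix on the CMP- mutation above is a libFuzzer dictionary entry: the comparison-tracing (CMP) mutation recorded those eight bytes from an intercepted comparison, and the later PersAutoDict- mutations replay them out of the persistent auto-dictionary. Decoding the octal escapes (an illustrative one-liner, runnable anywhere) shows they are the byte sequence ff 95 4b b1 3f a2 d1 4e, the same values that keep surfacing in the cdw10/cdw11 fields of the mutated commands:

  # Decode the escaped libFuzzer dictionary entry into raw hex.
  printf '\377\225K\261?\242\321N' | od -An -tx1
  # -> ff 95 4b b1 3f a2 d1 4e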
00:06:51.673 [2024-12-02 15:22:37.408094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.674 [2024-12-02 15:22:37.408120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.674 [2024-12-02 15:22:37.408174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.674 [2024-12-02 15:22:37.408188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.674 #24 NEW cov: 11816 ft: 13264 corp: 8/177b lim: 45 exec/s: 0 rss: 68Mb L: 20/31 MS: 1 ShuffleBytes- 00:06:51.674 [2024-12-02 15:22:37.448346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.674 [2024-12-02 15:22:37.448371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.674 [2024-12-02 15:22:37.448429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.674 [2024-12-02 15:22:37.448443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.674 [2024-12-02 15:22:37.448512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.674 [2024-12-02 15:22:37.448526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.674 #27 NEW cov: 11816 ft: 13302 corp: 9/204b lim: 45 exec/s: 0 rss: 68Mb L: 27/31 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:06:51.674 [2024-12-02 15:22:37.488454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:15155b15 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.674 [2024-12-02 15:22:37.488479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.674 [2024-12-02 15:22:37.488533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.674 [2024-12-02 15:22:37.488547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.674 [2024-12-02 15:22:37.488602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.674 [2024-12-02 15:22:37.488616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.934 #28 NEW cov: 11816 ft: 13353 corp: 10/232b lim: 45 exec/s: 0 rss: 68Mb L: 28/31 MS: 1 InsertByte- 00:06:51.934 [2024-12-02 15:22:37.528434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 
15:22:37.528459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.934 [2024-12-02 15:22:37.528513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:5d000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.528527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.934 #29 NEW cov: 11816 ft: 13448 corp: 11/253b lim: 45 exec/s: 0 rss: 68Mb L: 21/31 MS: 1 ShuffleBytes- 00:06:51.934 [2024-12-02 15:22:37.568709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.568738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.934 [2024-12-02 15:22:37.568794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.568807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.934 [2024-12-02 15:22:37.568861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:150f1515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.568874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.934 #30 NEW cov: 11816 ft: 13466 corp: 12/280b lim: 45 exec/s: 0 rss: 68Mb L: 27/31 MS: 1 ChangeBinInt- 00:06:51.934 [2024-12-02 15:22:37.608990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.609016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.934 [2024-12-02 15:22:37.609074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:954b3cff cdw11:b13f0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.609088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.934 [2024-12-02 15:22:37.609142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.609155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.934 [2024-12-02 15:22:37.609209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:3c3c3c3c cdw11:2b3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.609223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.934 #31 NEW cov: 11816 ft: 13852 corp: 13/316b lim: 45 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 PersAutoDict- DE: "\377\225K\261?\242\321N"- 00:06:51.934 [2024-12-02 15:22:37.648946] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:15955b15 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.648971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.934 [2024-12-02 15:22:37.649025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.649039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.934 [2024-12-02 15:22:37.649092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.649105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.934 #32 NEW cov: 11816 ft: 13883 corp: 14/344b lim: 45 exec/s: 0 rss: 69Mb L: 28/36 MS: 1 ChangeBit- 00:06:51.934 [2024-12-02 15:22:37.689221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4bb1ff95 cdw11:3fa20006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.689245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.934 [2024-12-02 15:22:37.689317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.689331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.934 [2024-12-02 15:22:37.689385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.689399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.934 [2024-12-02 15:22:37.689451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:3c3c3c3c cdw11:2b3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.689465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.934 #33 NEW cov: 11816 ft: 13907 corp: 15/380b lim: 45 exec/s: 0 rss: 69Mb L: 36/36 MS: 1 PersAutoDict- DE: "\377\225K\261?\242\321N"- 00:06:51.934 [2024-12-02 15:22:37.729179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.729207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.934 [2024-12-02 15:22:37.729262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.729276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.934 [2024-12-02 
15:22:37.729329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:150f1515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:51.934 [2024-12-02 15:22:37.729359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.194 #34 NEW cov: 11816 ft: 13910 corp: 16/407b lim: 45 exec/s: 0 rss: 69Mb L: 27/36 MS: 1 ShuffleBytes- 00:06:52.194 [2024-12-02 15:22:37.769290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.769315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.194 [2024-12-02 15:22:37.769370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.769384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.194 [2024-12-02 15:22:37.769436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00100000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.769450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.194 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:52.194 #35 NEW cov: 11839 ft: 13943 corp: 17/438b lim: 45 exec/s: 0 rss: 69Mb L: 31/36 MS: 1 ChangeBit- 00:06:52.194 [2024-12-02 15:22:37.809591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0b0b0b0b cdw11:0b0b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.809616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.194 [2024-12-02 15:22:37.809672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0b0b0b0b cdw11:0b0b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.809686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.194 [2024-12-02 15:22:37.809751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:0b0b0b0b cdw11:0b0b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.809765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.194 [2024-12-02 15:22:37.809820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:0b0b0b0b cdw11:0b0b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.809833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:52.194 #39 NEW cov: 11839 ft: 13959 corp: 18/482b lim: 45 exec/s: 0 rss: 69Mb L: 44/44 MS: 4 CopyPart-ChangeByte-ChangeByte-InsertRepeatedBytes- 00:06:52.194 [2024-12-02 15:22:37.849358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ 
(01) qid:0 cid:4 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.849383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.194 [2024-12-02 15:22:37.849439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.849453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.194 #40 NEW cov: 11839 ft: 13985 corp: 19/502b lim: 45 exec/s: 0 rss: 69Mb L: 20/44 MS: 1 EraseBytes- 00:06:52.194 [2024-12-02 15:22:37.889632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.889659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.194 [2024-12-02 15:22:37.889731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.889750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.194 [2024-12-02 15:22:37.889804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00100000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.889818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.194 #41 NEW cov: 11839 ft: 14070 corp: 20/533b lim: 45 exec/s: 41 rss: 69Mb L: 31/44 MS: 1 ChangeBit- 00:06:52.194 [2024-12-02 15:22:37.929749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.929775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.194 [2024-12-02 15:22:37.929832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.929845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.194 [2024-12-02 15:22:37.929899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.929912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.194 #42 NEW cov: 11839 ft: 14099 corp: 21/564b lim: 45 exec/s: 42 rss: 69Mb L: 31/44 MS: 1 ChangeBinInt- 00:06:52.194 [2024-12-02 15:22:37.969882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.969907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:06:52.194 [2024-12-02 15:22:37.969976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.969991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.194 [2024-12-02 15:22:37.970044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:37.970057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.194 #43 NEW cov: 11839 ft: 14163 corp: 22/595b lim: 45 exec/s: 43 rss: 69Mb L: 31/44 MS: 1 ShuffleBytes- 00:06:52.194 [2024-12-02 15:22:38.010025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:38.010050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.194 [2024-12-02 15:22:38.010108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:38.010122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.194 [2024-12-02 15:22:38.010175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:3c003c3c cdw11:023c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.194 [2024-12-02 15:22:38.010189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.454 #44 NEW cov: 11839 ft: 14184 corp: 23/625b lim: 45 exec/s: 44 rss: 69Mb L: 30/44 MS: 1 CMP- DE: "\000\002"- 00:06:52.454 [2024-12-02 15:22:38.050093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.454 [2024-12-02 15:22:38.050117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.454 [2024-12-02 15:22:38.050188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.454 [2024-12-02 15:22:38.050202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.454 [2024-12-02 15:22:38.050256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:150f1515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.454 [2024-12-02 15:22:38.050269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.454 #45 NEW cov: 11839 ft: 14195 corp: 24/652b lim: 45 exec/s: 45 rss: 69Mb L: 27/44 MS: 1 CopyPart- 00:06:52.454 [2024-12-02 15:22:38.090235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.454 [2024-12-02 15:22:38.090261] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.454 [2024-12-02 15:22:38.090318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.454 [2024-12-02 15:22:38.090332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.454 [2024-12-02 15:22:38.090387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:150f1515 cdw11:150b0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.454 [2024-12-02 15:22:38.090401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.454 #46 NEW cov: 11839 ft: 14206 corp: 25/680b lim: 45 exec/s: 46 rss: 69Mb L: 28/44 MS: 1 InsertByte- 00:06:52.454 [2024-12-02 15:22:38.130315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.454 [2024-12-02 15:22:38.130340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.454 [2024-12-02 15:22:38.130412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.454 [2024-12-02 15:22:38.130426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.454 [2024-12-02 15:22:38.130482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:150f1515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.454 [2024-12-02 15:22:38.130496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.454 #47 NEW cov: 11839 ft: 14215 corp: 26/707b lim: 45 exec/s: 47 rss: 69Mb L: 27/44 MS: 1 ChangeBinInt- 00:06:52.454 [2024-12-02 15:22:38.170456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.454 [2024-12-02 15:22:38.170481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.454 [2024-12-02 15:22:38.170538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.454 [2024-12-02 15:22:38.170552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.454 [2024-12-02 15:22:38.170607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.454 [2024-12-02 15:22:38.170621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.455 #48 NEW cov: 11839 ft: 14217 corp: 27/736b lim: 45 exec/s: 48 rss: 69Mb L: 29/44 MS: 1 EraseBytes- 00:06:52.455 [2024-12-02 15:22:38.210275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 
nsid:0 cdw10:15150015 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.455 [2024-12-02 15:22:38.210299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.455 #49 NEW cov: 11839 ft: 14943 corp: 28/753b lim: 45 exec/s: 49 rss: 69Mb L: 17/44 MS: 1 CrossOver- 00:06:52.455 [2024-12-02 15:22:38.250682] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.455 [2024-12-02 15:22:38.250707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.455 [2024-12-02 15:22:38.250778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.455 [2024-12-02 15:22:38.250793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.455 [2024-12-02 15:22:38.250848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00100000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.455 [2024-12-02 15:22:38.250862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.714 #50 NEW cov: 11839 ft: 14951 corp: 29/784b lim: 45 exec/s: 50 rss: 70Mb L: 31/44 MS: 1 ChangeByte- 00:06:52.714 [2024-12-02 15:22:38.290829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.714 [2024-12-02 15:22:38.290854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.714 [2024-12-02 15:22:38.290927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:15151b15 cdw11:150f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.714 [2024-12-02 15:22:38.290942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.714 [2024-12-02 15:22:38.290996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:150f1515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.714 [2024-12-02 15:22:38.291009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.714 #51 NEW cov: 11839 ft: 14969 corp: 30/811b lim: 45 exec/s: 51 rss: 70Mb L: 27/44 MS: 1 CopyPart- 00:06:52.714 [2024-12-02 15:22:38.330931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.714 [2024-12-02 15:22:38.330961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.714 [2024-12-02 15:22:38.331019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.714 [2024-12-02 15:22:38.331032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.714 
[2024-12-02 15:22:38.331086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:3c3c3c3c cdw11:3c4b0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.714 [2024-12-02 15:22:38.331100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.714 #52 NEW cov: 11839 ft: 14986 corp: 31/839b lim: 45 exec/s: 52 rss: 70Mb L: 28/44 MS: 1 CrossOver- 00:06:52.714 [2024-12-02 15:22:38.371048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:155e1515 cdw11:5e5e0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.714 [2024-12-02 15:22:38.371073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.714 [2024-12-02 15:22:38.371127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.714 [2024-12-02 15:22:38.371142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.714 [2024-12-02 15:22:38.371196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:0f151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.714 [2024-12-02 15:22:38.371210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.714 #53 NEW cov: 11839 ft: 15002 corp: 32/872b lim: 45 exec/s: 53 rss: 70Mb L: 33/44 MS: 1 InsertRepeatedBytes- 00:06:52.714 [2024-12-02 15:22:38.411197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ff953c3c cdw11:4bb10001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.714 [2024-12-02 15:22:38.411221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.714 [2024-12-02 15:22:38.411276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3c7c4e3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.714 [2024-12-02 15:22:38.411290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.714 [2024-12-02 15:22:38.411345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.714 [2024-12-02 15:22:38.411358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.714 #54 NEW cov: 11839 ft: 15023 corp: 33/900b lim: 45 exec/s: 54 rss: 70Mb L: 28/44 MS: 1 ChangeBit- 00:06:52.715 [2024-12-02 15:22:38.451265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:15155b15 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.715 [2024-12-02 15:22:38.451289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.715 [2024-12-02 15:22:38.451346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.715 [2024-12-02 15:22:38.451360] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.715 [2024-12-02 15:22:38.451413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.715 [2024-12-02 15:22:38.451429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.715 #55 NEW cov: 11839 ft: 15028 corp: 34/930b lim: 45 exec/s: 55 rss: 70Mb L: 30/44 MS: 1 PersAutoDict- DE: "\000\002"- 00:06:52.715 [2024-12-02 15:22:38.491196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:3fa24bb1 cdw11:d14e0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.715 [2024-12-02 15:22:38.491220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.715 [2024-12-02 15:22:38.491292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.715 [2024-12-02 15:22:38.491306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.715 #56 NEW cov: 11839 ft: 15031 corp: 35/954b lim: 45 exec/s: 56 rss: 70Mb L: 24/44 MS: 1 EraseBytes- 00:06:52.715 [2024-12-02 15:22:38.531537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:15150a15 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.715 [2024-12-02 15:22:38.531562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.715 [2024-12-02 15:22:38.531619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.715 [2024-12-02 15:22:38.531633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.715 [2024-12-02 15:22:38.531689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.715 [2024-12-02 15:22:38.531703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.974 #58 NEW cov: 11839 ft: 15041 corp: 36/982b lim: 45 exec/s: 58 rss: 70Mb L: 28/44 MS: 2 ShuffleBytes-CrossOver- 00:06:52.974 [2024-12-02 15:22:38.571759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4bb1ff95 cdw11:3fa20006 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.974 [2024-12-02 15:22:38.571784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.974 [2024-12-02 15:22:38.571836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.974 [2024-12-02 15:22:38.571850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.974 [2024-12-02 15:22:38.571905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) 
qid:0 cid:6 nsid:0 cdw10:3c3c3c3c cdw11:3c3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.974 [2024-12-02 15:22:38.571935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.974 [2024-12-02 15:22:38.571992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:3c3c3c3c cdw11:2b3c0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.974 [2024-12-02 15:22:38.572005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:52.974 #59 NEW cov: 11839 ft: 15042 corp: 37/1022b lim: 45 exec/s: 59 rss: 70Mb L: 40/44 MS: 1 InsertRepeatedBytes- 00:06:52.974 [2024-12-02 15:22:38.611599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:15155b15 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.974 [2024-12-02 15:22:38.611624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.974 [2024-12-02 15:22:38.611681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.974 [2024-12-02 15:22:38.611694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.974 #60 NEW cov: 11839 ft: 15108 corp: 38/1041b lim: 45 exec/s: 60 rss: 70Mb L: 19/44 MS: 1 EraseBytes- 00:06:52.974 [2024-12-02 15:22:38.652044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:954b15ff cdw11:b13f0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.974 [2024-12-02 15:22:38.652069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.974 [2024-12-02 15:22:38.652123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.974 [2024-12-02 15:22:38.652137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.974 [2024-12-02 15:22:38.652189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.974 [2024-12-02 15:22:38.652202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.974 [2024-12-02 15:22:38.652253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:0f151515 cdw11:0b150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.974 [2024-12-02 15:22:38.652266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:52.974 #61 NEW cov: 11839 ft: 15172 corp: 39/1077b lim: 45 exec/s: 61 rss: 70Mb L: 36/44 MS: 1 PersAutoDict- DE: "\377\225K\261?\242\321N"- 00:06:52.974 [2024-12-02 15:22:38.691958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:001f0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.975 [2024-12-02 15:22:38.691982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.975 [2024-12-02 15:22:38.692052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.975 [2024-12-02 15:22:38.692066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.975 [2024-12-02 15:22:38.692120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00100000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.975 [2024-12-02 15:22:38.692133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.975 #62 NEW cov: 11839 ft: 15174 corp: 40/1108b lim: 45 exec/s: 62 rss: 70Mb L: 31/44 MS: 1 ChangeBinInt- 00:06:52.975 [2024-12-02 15:22:38.732093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.975 [2024-12-02 15:22:38.732117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.975 [2024-12-02 15:22:38.732172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.975 [2024-12-02 15:22:38.732186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.975 [2024-12-02 15:22:38.732237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.975 [2024-12-02 15:22:38.732251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.975 #63 NEW cov: 11839 ft: 15183 corp: 41/1139b lim: 45 exec/s: 63 rss: 70Mb L: 31/44 MS: 1 CopyPart- 00:06:52.975 [2024-12-02 15:22:38.771895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0a150f15 cdw11:0f150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:52.975 [2024-12-02 15:22:38.771920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.235 #66 NEW cov: 11839 ft: 15196 corp: 42/1152b lim: 45 exec/s: 66 rss: 70Mb L: 13/44 MS: 3 ShuffleBytes-CrossOver-CrossOver- 00:06:53.235 [2024-12-02 15:22:38.812529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:15150015 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:53.235 [2024-12-02 15:22:38.812554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.235 [2024-12-02 15:22:38.812622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:79790079 cdw11:79790003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:53.235 [2024-12-02 15:22:38.812636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.235 [2024-12-02 15:22:38.812688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:79797979 cdw11:79790003 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:06:53.235 [2024-12-02 15:22:38.812701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:53.235 [2024-12-02 15:22:38.812758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:79797979 cdw11:79790003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:53.235 [2024-12-02 15:22:38.812772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:53.235 #67 NEW cov: 11839 ft: 15206 corp: 43/1196b lim: 45 exec/s: 67 rss: 70Mb L: 44/44 MS: 1 InsertRepeatedBytes- 00:06:53.235 [2024-12-02 15:22:38.852468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:53.235 [2024-12-02 15:22:38.852492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.235 [2024-12-02 15:22:38.852561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:53.235 [2024-12-02 15:22:38.852575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.235 [2024-12-02 15:22:38.852626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:53.235 [2024-12-02 15:22:38.852640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:53.235 #68 NEW cov: 11839 ft: 15212 corp: 44/1227b lim: 45 exec/s: 68 rss: 70Mb L: 31/44 MS: 1 ShuffleBytes- 00:06:53.235 [2024-12-02 15:22:38.892956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:53.235 [2024-12-02 15:22:38.892982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.235 [2024-12-02 15:22:38.893037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:53.235 [2024-12-02 15:22:38.893050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.235 [2024-12-02 15:22:38.893103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:53.235 [2024-12-02 15:22:38.893119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:53.235 [2024-12-02 15:22:38.893171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:53.235 [2024-12-02 15:22:38.893184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:53.235 [2024-12-02 15:22:38.893237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:8 nsid:0 cdw10:15151515 cdw11:15150000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:06:53.235 [2024-12-02 15:22:38.893250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:53.235 #69 NEW cov: 11839 ft: 15277 corp: 45/1272b lim: 45 exec/s: 34 rss: 70Mb L: 45/45 MS: 1 CopyPart- 00:06:53.235 #69 DONE cov: 11839 ft: 15277 corp: 45/1272b lim: 45 exec/s: 34 rss: 70Mb 00:06:53.235 ###### Recommended dictionary. ###### 00:06:53.235 "\377\225K\261?\242\321N" # Uses: 3 00:06:53.235 "\000\002" # Uses: 1 00:06:53.235 ###### End of recommended dictionary. ###### 00:06:53.235 Done 69 runs in 2 second(s) 00:06:53.235 15:22:39 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:06:53.235 15:22:39 -- ../common.sh@72 -- # (( i++ )) 00:06:53.235 15:22:39 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:53.235 15:22:39 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:06:53.235 15:22:39 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:06:53.235 15:22:39 -- nvmf/run.sh@24 -- # local timen=1 00:06:53.235 15:22:39 -- nvmf/run.sh@25 -- # local core=0x1 00:06:53.235 15:22:39 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:06:53.236 15:22:39 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:06:53.236 15:22:39 -- nvmf/run.sh@29 -- # printf %02d 6 00:06:53.236 15:22:39 -- nvmf/run.sh@29 -- # port=4406 00:06:53.236 15:22:39 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:06:53.236 15:22:39 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:06:53.236 15:22:39 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:53.236 15:22:39 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:06:53.496 [2024-12-02 15:22:39.078173] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:53.496 [2024-12-02 15:22:39.078269] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1597917 ] 00:06:53.496 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.496 [2024-12-02 15:22:39.262407] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.756 [2024-12-02 15:22:39.326126] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:53.756 [2024-12-02 15:22:39.326251] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.756 [2024-12-02 15:22:39.384124] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:53.756 [2024-12-02 15:22:39.400439] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:06:53.756 INFO: Running with entropic power schedule (0xFF, 100). 
00:06:53.756 INFO: Seed: 3869836472 00:06:53.756 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:53.756 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:53.756 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:06:53.756 INFO: A corpus is not provided, starting from an empty corpus 00:06:53.756 #2 INITED exec/s: 0 rss: 60Mb 00:06:53.756 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:53.756 This may also happen if the target rejected all inputs we tried so far 00:06:53.756 [2024-12-02 15:22:39.470028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:53.756 [2024-12-02 15:22:39.470068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.015 NEW_FUNC[1/669]: 0x445308 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:06:54.015 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:54.015 #3 NEW cov: 11529 ft: 11530 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CopyPart- 00:06:54.015 [2024-12-02 15:22:39.820805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000290a cdw11:00000000 00:06:54.015 [2024-12-02 15:22:39.820855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.274 #4 NEW cov: 11642 ft: 12062 corp: 3/5b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:06:54.274 [2024-12-02 15:22:39.861022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.274 [2024-12-02 15:22:39.861051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.274 #5 NEW cov: 11648 ft: 12322 corp: 4/7b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CMP- DE: "\000\000"- 00:06:54.274 [2024-12-02 15:22:39.900802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:06:54.274 [2024-12-02 15:22:39.900829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.274 #8 NEW cov: 11733 ft: 12535 corp: 5/10b lim: 10 exec/s: 0 rss: 68Mb L: 3/3 MS: 3 ChangeBit-ShuffleBytes-PersAutoDict- DE: "\000\000"- 00:06:54.274 [2024-12-02 15:22:39.941153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000418a cdw11:00000000 00:06:54.274 [2024-12-02 15:22:39.941183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.274 [2024-12-02 15:22:39.941296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.274 [2024-12-02 15:22:39.941313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.274 #9 NEW cov: 11733 ft: 12802 corp: 6/14b lim: 10 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 InsertByte- 
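The NEW_FUNC lines above name the two entry points libFuzzer instrumented for this run: TestOneInput, the generic libFuzzer hook, and fuzz_admin_delete_io_completion_queue_command, the handler for the opcode being exercised. A minimal sketch of that shape follows, assuming a simplified command struct; this is illustrative only, not SPDK's actual llvm_nvme_fuzz.c, and submit_admin_cmd is a hypothetical stand-in for the real submit-and-wait transport call.

    /* Illustrative libFuzzer harness sketch; struct and submit function
     * are simplified stand-ins, not SPDK's spdk_nvme_cmd or its API. */
    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    struct fuzz_cmd {
        uint8_t  opc;    /* admin opcode; DELETE IO CQ is 04h, as the log shows */
        uint32_t cdw10;  /* dword 10: queue identifier for delete-queue commands */
        uint32_t cdw11;  /* dword 11 */
    };

    /* Hypothetical stand-in: the real harness submits to the admin qpair
     * and the qpair code prints the command/completion records seen above. */
    static void submit_admin_cmd(const struct fuzz_cmd *cmd)
    {
        printf("opc:%02x cdw10:%08x cdw11:%08x\n",
               cmd->opc, (unsigned)cmd->cdw10, (unsigned)cmd->cdw11);
    }

    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
    {
        struct fuzz_cmd cmd = { .opc = 0x04 };   /* DELETE IO CQ (04) */

        if (size < 8) {
            return 0;                            /* need two full dwords */
        }
        /* Each mutated input becomes the raw dwords the target must validate. */
        memcpy(&cmd.cdw10, data, sizeof(cmd.cdw10));
        memcpy(&cmd.cdw11, data + 4, sizeof(cmd.cdw11));
        submit_admin_cmd(&cmd);
        return 0;
    }

Under this shape, every corpus entry in the "#N NEW cov" lines corresponds to one fuzzed admin command, and the paired print_command/print_completion records show what was sent and how the target rejected it.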
00:06:54.274 [2024-12-02 15:22:39.981082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:54.274 [2024-12-02 15:22:39.981110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.274 #10 NEW cov: 11733 ft: 12892 corp: 7/16b lim: 10 exec/s: 0 rss: 68Mb L: 2/4 MS: 1 CMP- DE: "\377\377"- 00:06:54.274 [2024-12-02 15:22:40.021534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000a0a0 cdw11:00000000 00:06:54.274 [2024-12-02 15:22:40.021562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.274 [2024-12-02 15:22:40.021685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000a0a0 cdw11:00000000 00:06:54.274 [2024-12-02 15:22:40.021703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.274 [2024-12-02 15:22:40.021833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a0a0 cdw11:00000000 00:06:54.274 [2024-12-02 15:22:40.021854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.274 #11 NEW cov: 11733 ft: 13120 corp: 8/23b lim: 10 exec/s: 0 rss: 68Mb L: 7/7 MS: 1 InsertRepeatedBytes- 00:06:54.274 [2024-12-02 15:22:40.061524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:06:54.274 [2024-12-02 15:22:40.061552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.274 [2024-12-02 15:22:40.061663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.274 [2024-12-02 15:22:40.061679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.274 #12 NEW cov: 11733 ft: 13136 corp: 9/27b lim: 10 exec/s: 0 rss: 68Mb L: 4/7 MS: 1 CrossOver- 00:06:54.533 [2024-12-02 15:22:40.101515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a29 cdw11:00000000 00:06:54.533 [2024-12-02 15:22:40.101543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.533 #13 NEW cov: 11733 ft: 13311 corp: 10/29b lim: 10 exec/s: 0 rss: 68Mb L: 2/7 MS: 1 ShuffleBytes- 00:06:54.533 [2024-12-02 15:22:40.141858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:06:54.533 [2024-12-02 15:22:40.141885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.533 [2024-12-02 15:22:40.142009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.533 [2024-12-02 15:22:40.142024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.533 #14 NEW cov: 11733 ft: 13435 corp: 11/33b lim: 10 exec/s: 0 rss: 68Mb L: 4/7 MS: 1 CrossOver- 
00:06:54.533 [2024-12-02 15:22:40.181896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008400 cdw11:00000000 00:06:54.533 [2024-12-02 15:22:40.181926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.533 [2024-12-02 15:22:40.182054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.533 [2024-12-02 15:22:40.182071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.533 #15 NEW cov: 11733 ft: 13465 corp: 12/37b lim: 10 exec/s: 0 rss: 68Mb L: 4/7 MS: 1 ChangeBinInt- 00:06:54.533 [2024-12-02 15:22:40.221896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:06:54.533 [2024-12-02 15:22:40.221925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.533 #16 NEW cov: 11733 ft: 13474 corp: 13/40b lim: 10 exec/s: 0 rss: 68Mb L: 3/7 MS: 1 EraseBytes- 00:06:54.533 [2024-12-02 15:22:40.262357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a8a cdw11:00000000 00:06:54.533 [2024-12-02 15:22:40.262385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.533 [2024-12-02 15:22:40.262498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.533 [2024-12-02 15:22:40.262515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.533 [2024-12-02 15:22:40.262629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.533 [2024-12-02 15:22:40.262648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.533 #17 NEW cov: 11733 ft: 13494 corp: 14/46b lim: 10 exec/s: 0 rss: 69Mb L: 6/7 MS: 1 CopyPart- 00:06:54.533 [2024-12-02 15:22:40.312174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a10 cdw11:00000000 00:06:54.533 [2024-12-02 15:22:40.312203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.533 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:54.533 #18 NEW cov: 11756 ft: 13517 corp: 15/49b lim: 10 exec/s: 0 rss: 69Mb L: 3/7 MS: 1 ChangeBit- 00:06:54.533 [2024-12-02 15:22:40.352278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a04 cdw11:00000000 00:06:54.533 [2024-12-02 15:22:40.352306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.793 #19 NEW cov: 11756 ft: 13538 corp: 16/52b lim: 10 exec/s: 0 rss: 69Mb L: 3/7 MS: 1 ChangeBinInt- 00:06:54.793 [2024-12-02 15:22:40.392355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000fffd cdw11:00000000 00:06:54.793 [2024-12-02 15:22:40.392382] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.793 #20 NEW cov: 11756 ft: 13552 corp: 17/54b lim: 10 exec/s: 0 rss: 69Mb L: 2/7 MS: 1 ChangeBinInt- 00:06:54.793 [2024-12-02 15:22:40.432685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:06:54.793 [2024-12-02 15:22:40.432714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.793 [2024-12-02 15:22:40.432831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.793 [2024-12-02 15:22:40.432848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.793 #21 NEW cov: 11756 ft: 13564 corp: 18/58b lim: 10 exec/s: 21 rss: 69Mb L: 4/7 MS: 1 ChangeBinInt- 00:06:54.793 [2024-12-02 15:22:40.472603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.793 [2024-12-02 15:22:40.472631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.793 #22 NEW cov: 11756 ft: 13581 corp: 19/60b lim: 10 exec/s: 22 rss: 69Mb L: 2/7 MS: 1 CopyPart- 00:06:54.793 [2024-12-02 15:22:40.512744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:06:54.793 [2024-12-02 15:22:40.512772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.793 #23 NEW cov: 11756 ft: 13610 corp: 20/63b lim: 10 exec/s: 23 rss: 69Mb L: 3/7 MS: 1 ChangeByte- 00:06:54.793 [2024-12-02 15:22:40.552433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.793 [2024-12-02 15:22:40.552462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.793 #24 NEW cov: 11756 ft: 13638 corp: 21/66b lim: 10 exec/s: 24 rss: 69Mb L: 3/7 MS: 1 EraseBytes- 00:06:54.793 [2024-12-02 15:22:40.592936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000fffb cdw11:00000000 00:06:54.793 [2024-12-02 15:22:40.592964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.052 #25 NEW cov: 11756 ft: 13678 corp: 22/68b lim: 10 exec/s: 25 rss: 69Mb L: 2/7 MS: 1 ChangeBinInt- 00:06:55.052 [2024-12-02 15:22:40.633125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 00:06:55.052 [2024-12-02 15:22:40.633153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.052 #26 NEW cov: 11756 ft: 13692 corp: 23/71b lim: 10 exec/s: 26 rss: 69Mb L: 3/7 MS: 1 ShuffleBytes- 00:06:55.052 [2024-12-02 15:22:40.673279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000700 cdw11:00000000 00:06:55.052 [2024-12-02 15:22:40.673308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
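Every completion record in this stream ends with the same fields: the parenthesized pair after INVALID OPCODE is status-code-type/status-code, here 00h (generic command status) / 01h (invalid command opcode), followed by sqhd, p, m, and dnr. A hedged sketch of how those fields unpack from the 16-bit phase+status word of a completion queue entry, with bit positions per the NVMe base spec CQE layout, reduced to just the fields the log prints:

    /* Sketch: decode the "(sct/sc) p:_ m:_ dnr:_" fields the log prints
     * from the 16-bit phase+status word of an NVMe completion entry. */
    #include <stdint.h>
    #include <stdio.h>

    static void print_status(uint16_t w)
    {
        unsigned p   = w & 0x1;           /* phase tag              */
        unsigned sc  = (w >> 1) & 0xff;   /* status code            */
        unsigned sct = (w >> 9) & 0x7;    /* status code type       */
        unsigned m   = (w >> 14) & 0x1;   /* more                   */
        unsigned dnr = (w >> 15) & 0x1;   /* do not retry           */

        printf("(%02x/%02x) p:%u m:%u dnr:%u\n", sct, sc, p, m, dnr);
    }

    int main(void)
    {
        /* sct=0, sc=1 reproduces the "(00/01) p:0 m:0 dnr:0" seen above */
        print_status(0x1 << 1);
        return 0;
    }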
00:06:55.052 #27 NEW cov: 11756 ft: 13694 corp: 24/73b lim: 10 exec/s: 27 rss: 69Mb L: 2/7 MS: 1 ChangeBinInt- 00:06:55.052 [2024-12-02 15:22:40.713398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008aff cdw11:00000000 00:06:55.052 [2024-12-02 15:22:40.713426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.052 #28 NEW cov: 11756 ft: 13697 corp: 25/76b lim: 10 exec/s: 28 rss: 69Mb L: 3/7 MS: 1 PersAutoDict- DE: "\377\377"- 00:06:55.052 [2024-12-02 15:22:40.753538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:55.052 [2024-12-02 15:22:40.753567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.052 #29 NEW cov: 11756 ft: 13698 corp: 26/78b lim: 10 exec/s: 29 rss: 69Mb L: 2/7 MS: 1 ShuffleBytes- 00:06:55.053 [2024-12-02 15:22:40.794045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:55.053 [2024-12-02 15:22:40.794072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.053 [2024-12-02 15:22:40.794187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff8a cdw11:00000000 00:06:55.053 [2024-12-02 15:22:40.794204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.053 [2024-12-02 15:22:40.794316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:55.053 [2024-12-02 15:22:40.794333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.053 #30 NEW cov: 11756 ft: 13776 corp: 27/85b lim: 10 exec/s: 30 rss: 69Mb L: 7/7 MS: 1 InsertRepeatedBytes- 00:06:55.053 [2024-12-02 15:22:40.833294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:55.053 [2024-12-02 15:22:40.833322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.053 #31 NEW cov: 11756 ft: 13817 corp: 28/87b lim: 10 exec/s: 31 rss: 69Mb L: 2/7 MS: 1 CopyPart- 00:06:55.312 [2024-12-02 15:22:40.873829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:06:55.312 [2024-12-02 15:22:40.873858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.312 #32 NEW cov: 11756 ft: 13842 corp: 29/89b lim: 10 exec/s: 32 rss: 69Mb L: 2/7 MS: 1 EraseBytes- 00:06:55.312 [2024-12-02 15:22:40.913973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:06:55.312 [2024-12-02 15:22:40.914002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.312 #33 NEW cov: 11756 ft: 13901 corp: 30/91b lim: 10 exec/s: 33 rss: 69Mb L: 2/7 MS: 1 EraseBytes- 00:06:55.312 [2024-12-02 15:22:40.954739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a04 cdw11:00000000 00:06:55.312 [2024-12-02 15:22:40.954768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.312 [2024-12-02 15:22:40.954881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000bfbf cdw11:00000000 00:06:55.312 [2024-12-02 15:22:40.954902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.312 [2024-12-02 15:22:40.955015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000bfbf cdw11:00000000 00:06:55.312 [2024-12-02 15:22:40.955032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.312 [2024-12-02 15:22:40.955147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000bfbf cdw11:00000000 00:06:55.312 [2024-12-02 15:22:40.955166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.312 #34 NEW cov: 11756 ft: 14118 corp: 31/100b lim: 10 exec/s: 34 rss: 69Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:06:55.312 [2024-12-02 15:22:40.993759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a08 cdw11:00000000 00:06:55.312 [2024-12-02 15:22:40.993786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.312 #35 NEW cov: 11756 ft: 14121 corp: 32/103b lim: 10 exec/s: 35 rss: 69Mb L: 3/9 MS: 1 ChangeBinInt- 00:06:55.312 [2024-12-02 15:22:41.034915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000000a0 cdw11:00000000 00:06:55.312 [2024-12-02 15:22:41.034943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.312 [2024-12-02 15:22:41.035065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000a0a0 cdw11:00000000 00:06:55.312 [2024-12-02 15:22:41.035084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.312 [2024-12-02 15:22:41.035195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000a0a0 cdw11:00000000 00:06:55.312 [2024-12-02 15:22:41.035213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.312 [2024-12-02 15:22:41.035331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000000a0 cdw11:00000000 00:06:55.312 [2024-12-02 15:22:41.035346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.312 #36 NEW cov: 11756 ft: 14159 corp: 33/111b lim: 10 exec/s: 36 rss: 70Mb L: 8/9 MS: 1 CrossOver- 00:06:55.312 [2024-12-02 15:22:41.074392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:55.312 [2024-12-02 15:22:41.074419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:06:55.312 #37 NEW cov: 11756 ft: 14161 corp: 34/113b lim: 10 exec/s: 37 rss: 70Mb L: 2/9 MS: 1 PersAutoDict- DE: "\000\000"- 00:06:55.312 [2024-12-02 15:22:41.114590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001800 cdw11:00000000 00:06:55.312 [2024-12-02 15:22:41.114617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.572 #39 NEW cov: 11756 ft: 14181 corp: 35/115b lim: 10 exec/s: 39 rss: 70Mb L: 2/9 MS: 2 EraseBytes-InsertByte- 00:06:55.572 [2024-12-02 15:22:41.154683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00007104 cdw11:00000000 00:06:55.572 [2024-12-02 15:22:41.154710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.573 #40 NEW cov: 11756 ft: 14183 corp: 36/118b lim: 10 exec/s: 40 rss: 70Mb L: 3/9 MS: 1 ChangeBinInt- 00:06:55.573 [2024-12-02 15:22:41.185254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.185281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.573 [2024-12-02 15:22:41.185403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.185420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.573 [2024-12-02 15:22:41.185539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a0a cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.185556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.573 #41 NEW cov: 11756 ft: 14191 corp: 37/124b lim: 10 exec/s: 41 rss: 70Mb L: 6/9 MS: 1 InsertRepeatedBytes- 00:06:55.573 [2024-12-02 15:22:41.225118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.225145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.573 [2024-12-02 15:22:41.225275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008a04 cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.225291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.573 #42 NEW cov: 11756 ft: 14193 corp: 38/128b lim: 10 exec/s: 42 rss: 70Mb L: 4/9 MS: 1 CopyPart- 00:06:55.573 [2024-12-02 15:22:41.265940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a04 cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.265968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.573 [2024-12-02 15:22:41.266088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000bfbf cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.266107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.573 [2024-12-02 15:22:41.266220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000044bf cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.266238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.573 [2024-12-02 15:22:41.266345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000bfbf cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.266360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.573 [2024-12-02 15:22:41.266472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000bf00 cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.266489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:55.573 #43 NEW cov: 11756 ft: 14248 corp: 39/138b lim: 10 exec/s: 43 rss: 70Mb L: 10/10 MS: 1 InsertByte- 00:06:55.573 [2024-12-02 15:22:41.305121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.305149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.573 #44 NEW cov: 11756 ft: 14258 corp: 40/140b lim: 10 exec/s: 44 rss: 70Mb L: 2/10 MS: 1 ShuffleBytes- 00:06:55.573 [2024-12-02 15:22:41.345237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.345264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.573 #45 NEW cov: 11756 ft: 14284 corp: 41/142b lim: 10 exec/s: 45 rss: 70Mb L: 2/10 MS: 1 PersAutoDict- DE: "\000\000"- 00:06:55.573 [2024-12-02 15:22:41.386106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.386137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.573 [2024-12-02 15:22:41.386261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000d0d cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.386279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.573 [2024-12-02 15:22:41.386391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000d0d cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.386406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.573 [2024-12-02 15:22:41.386526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000d0d cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.386542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.573 [2024-12-02 15:22:41.386654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 
cdw10:00000d00 cdw11:00000000 00:06:55.573 [2024-12-02 15:22:41.386671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:55.833 #46 NEW cov: 11756 ft: 14297 corp: 42/152b lim: 10 exec/s: 46 rss: 70Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:06:55.833 [2024-12-02 15:22:41.425675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00008a00 cdw11:00000000 00:06:55.833 [2024-12-02 15:22:41.425706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.833 [2024-12-02 15:22:41.425837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00008500 cdw11:00000000 00:06:55.833 [2024-12-02 15:22:41.425854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.833 #47 NEW cov: 11756 ft: 14301 corp: 43/156b lim: 10 exec/s: 23 rss: 70Mb L: 4/10 MS: 1 ChangeByte- 00:06:55.833 #47 DONE cov: 11756 ft: 14301 corp: 43/156b lim: 10 exec/s: 23 rss: 70Mb 00:06:55.833 ###### Recommended dictionary. ###### 00:06:55.833 "\000\000" # Uses: 3 00:06:55.833 "\377\377" # Uses: 1 00:06:55.833 ###### End of recommended dictionary. ###### 00:06:55.833 Done 47 runs in 2 second(s) 00:06:55.833 15:22:41 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:06:55.833 15:22:41 -- ../common.sh@72 -- # (( i++ )) 00:06:55.833 15:22:41 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:55.833 15:22:41 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:06:55.833 15:22:41 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:06:55.833 15:22:41 -- nvmf/run.sh@24 -- # local timen=1 00:06:55.833 15:22:41 -- nvmf/run.sh@25 -- # local core=0x1 00:06:55.833 15:22:41 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:06:55.833 15:22:41 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:06:55.833 15:22:41 -- nvmf/run.sh@29 -- # printf %02d 7 00:06:55.833 15:22:41 -- nvmf/run.sh@29 -- # port=4407 00:06:55.833 15:22:41 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:06:55.833 15:22:41 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:06:55.833 15:22:41 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:55.833 15:22:41 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:06:55.833 [2024-12-02 15:22:41.609354] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
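The launch line in the trace above hands the next fuzzer its target as a flat transport-ID string via -F, after the sed step rewrites trsvcid 4420 to the per-fuzzer port (4407 here, derived from port=44 plus the printf %02d index). As a rough illustration of that key:value format, a small parser sketch; this is illustrative only, not SPDK's spdk_nvme_transport_id_parse, and note the first colon in each field is the separator, so values such as the subnqn may themselves contain colons:

    /* Sketch: split the space-separated key:value transport-ID string
     * that run.sh builds and passes to the fuzzer with -F. */
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char trid[] = "trtype:tcp adrfam:IPv4 "
                      "subnqn:nqn.2016-06.io.spdk:cnode1 "
                      "traddr:127.0.0.1 trsvcid:4407";

        for (char *save = NULL, *tok = strtok_r(trid, " ", &save);
             tok != NULL; tok = strtok_r(NULL, " ", &save)) {
            char *sep = strchr(tok, ':');  /* first ':' splits key/value */
            if (sep == NULL) {
                continue;                  /* skip malformed fields      */
            }
            *sep = '\0';
            printf("%-7s = %s\n", tok, sep + 1);
        }
        return 0;
    }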
00:06:55.833 [2024-12-02 15:22:41.609421] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1598396 ] 00:06:55.833 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.092 [2024-12-02 15:22:41.860268] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.351 [2024-12-02 15:22:41.949082] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:56.351 [2024-12-02 15:22:41.949221] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.351 [2024-12-02 15:22:42.007161] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:56.351 [2024-12-02 15:22:42.023475] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:06:56.351 INFO: Running with entropic power schedule (0xFF, 100). 00:06:56.351 INFO: Seed: 2197882733 00:06:56.351 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:56.351 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:56.351 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:06:56.351 INFO: A corpus is not provided, starting from an empty corpus 00:06:56.351 #2 INITED exec/s: 0 rss: 60Mb 00:06:56.351 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:56.351 This may also happen if the target rejected all inputs we tried so far 00:06:56.351 [2024-12-02 15:22:42.082140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000830b cdw11:00000000 00:06:56.351 [2024-12-02 15:22:42.082169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.610 NEW_FUNC[1/668]: 0x445d08 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:06:56.610 NEW_FUNC[2/668]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:56.610 #4 NEW cov: 11528 ft: 11523 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 2 ChangeBit-InsertByte- 00:06:56.610 [2024-12-02 15:22:42.402952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000310a cdw11:00000000 00:06:56.610 [2024-12-02 15:22:42.402996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.610 NEW_FUNC[1/1]: 0xe94458 in spdk_process_is_primary /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/env.c:290 00:06:56.610 #5 NEW cov: 11642 ft: 11975 corp: 3/5b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:06:56.869 [2024-12-02 15:22:42.442972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:56.869 [2024-12-02 15:22:42.442999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.869 [2024-12-02 15:22:42.443065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:56.869 [2024-12-02 
15:22:42.443080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.869 #6 NEW cov: 11648 ft: 12389 corp: 4/10b lim: 10 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:06:56.869 [2024-12-02 15:22:42.483205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003800 cdw11:00000000 00:06:56.869 [2024-12-02 15:22:42.483231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.869 [2024-12-02 15:22:42.483282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:56.869 [2024-12-02 15:22:42.483296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.869 [2024-12-02 15:22:42.483343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:06:56.869 [2024-12-02 15:22:42.483359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.869 #7 NEW cov: 11733 ft: 12778 corp: 5/16b lim: 10 exec/s: 0 rss: 68Mb L: 6/6 MS: 1 InsertByte- 00:06:56.869 [2024-12-02 15:22:42.523104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000830b cdw11:00000000 00:06:56.869 [2024-12-02 15:22:42.523129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.869 #8 NEW cov: 11733 ft: 12935 corp: 6/18b lim: 10 exec/s: 0 rss: 68Mb L: 2/6 MS: 1 ShuffleBytes- 00:06:56.869 [2024-12-02 15:22:42.563542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002424 cdw11:00000000 00:06:56.869 [2024-12-02 15:22:42.563567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.869 [2024-12-02 15:22:42.563617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002424 cdw11:00000000 00:06:56.870 [2024-12-02 15:22:42.563630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.870 [2024-12-02 15:22:42.563680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:56.870 [2024-12-02 15:22:42.563693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.870 [2024-12-02 15:22:42.563741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:06:56.870 [2024-12-02 15:22:42.563754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.870 #9 NEW cov: 11733 ft: 13272 corp: 7/27b lim: 10 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:06:56.870 [2024-12-02 15:22:42.603319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008383 cdw11:00000000 00:06:56.870 [2024-12-02 15:22:42.603344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.870 #10 NEW cov: 11733 ft: 13338 corp: 8/29b lim: 10 exec/s: 0 rss: 68Mb L: 2/9 MS: 1 CopyPart- 00:06:56.870 [2024-12-02 15:22:42.643534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000835c cdw11:00000000 00:06:56.870 [2024-12-02 15:22:42.643558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.870 [2024-12-02 15:22:42.643622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005c5c cdw11:00000000 00:06:56.870 [2024-12-02 15:22:42.643636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.870 #11 NEW cov: 11733 ft: 13440 corp: 9/34b lim: 10 exec/s: 0 rss: 68Mb L: 5/9 MS: 1 InsertRepeatedBytes- 00:06:56.870 [2024-12-02 15:22:42.683583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000050a cdw11:00000000 00:06:56.870 [2024-12-02 15:22:42.683608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.129 #12 NEW cov: 11733 ft: 13460 corp: 10/36b lim: 10 exec/s: 0 rss: 68Mb L: 2/9 MS: 1 ChangeByte- 00:06:57.129 [2024-12-02 15:22:42.723687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000832a cdw11:00000000 00:06:57.129 [2024-12-02 15:22:42.723712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.129 #13 NEW cov: 11733 ft: 13513 corp: 11/39b lim: 10 exec/s: 0 rss: 69Mb L: 3/9 MS: 1 InsertByte- 00:06:57.129 [2024-12-02 15:22:42.763811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b0b cdw11:00000000 00:06:57.129 [2024-12-02 15:22:42.763848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.129 #14 NEW cov: 11733 ft: 13541 corp: 12/41b lim: 10 exec/s: 0 rss: 69Mb L: 2/9 MS: 1 CopyPart- 00:06:57.129 [2024-12-02 15:22:42.793881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff0a cdw11:00000000 00:06:57.129 [2024-12-02 15:22:42.793905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.129 #15 NEW cov: 11733 ft: 13589 corp: 13/43b lim: 10 exec/s: 0 rss: 69Mb L: 2/9 MS: 1 ChangeBinInt- 00:06:57.129 [2024-12-02 15:22:42.834036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ff31 cdw11:00000000 00:06:57.129 [2024-12-02 15:22:42.834060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.129 #16 NEW cov: 11733 ft: 13648 corp: 14/46b lim: 10 exec/s: 0 rss: 69Mb L: 3/9 MS: 1 CrossOver- 00:06:57.129 [2024-12-02 15:22:42.874345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000832a cdw11:00000000 00:06:57.129 [2024-12-02 15:22:42.874370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.129 [2024-12-02 15:22:42.874420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000bff cdw11:00000000 00:06:57.129 [2024-12-02 15:22:42.874433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.129 [2024-12-02 15:22:42.874482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:57.129 [2024-12-02 15:22:42.874494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.129 #17 NEW cov: 11733 ft: 13675 corp: 15/53b lim: 10 exec/s: 0 rss: 69Mb L: 7/9 MS: 1 InsertRepeatedBytes- 00:06:57.129 [2024-12-02 15:22:42.914245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00006d0b cdw11:00000000 00:06:57.129 [2024-12-02 15:22:42.914269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.129 #18 NEW cov: 11733 ft: 13693 corp: 16/55b lim: 10 exec/s: 0 rss: 69Mb L: 2/9 MS: 1 ChangeByte- 00:06:57.129 [2024-12-02 15:22:42.944548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003800 cdw11:00000000 00:06:57.129 [2024-12-02 15:22:42.944572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.129 [2024-12-02 15:22:42.944624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000000f0 cdw11:00000000 00:06:57.129 [2024-12-02 15:22:42.944637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.129 [2024-12-02 15:22:42.944685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:57.129 [2024-12-02 15:22:42.944698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.389 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:57.389 #24 NEW cov: 11756 ft: 13795 corp: 17/62b lim: 10 exec/s: 0 rss: 69Mb L: 7/9 MS: 1 InsertByte- 00:06:57.389 [2024-12-02 15:22:42.984897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002424 cdw11:00000000 00:06:57.389 [2024-12-02 15:22:42.984922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.389 [2024-12-02 15:22:42.984971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002424 cdw11:00000000 00:06:57.389 [2024-12-02 15:22:42.984987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.389 [2024-12-02 15:22:42.985036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:57.389 [2024-12-02 15:22:42.985049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.389 [2024-12-02 15:22:42.985097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000023 cdw11:00000000 00:06:57.389 
[2024-12-02 15:22:42.985110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:57.389 [2024-12-02 15:22:42.985158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:06:57.389 [2024-12-02 15:22:42.985171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:57.389 #25 NEW cov: 11756 ft: 13872 corp: 18/72b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 InsertByte- 00:06:57.389 [2024-12-02 15:22:43.024650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000832a cdw11:00000000 00:06:57.389 [2024-12-02 15:22:43.024676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.389 [2024-12-02 15:22:43.024726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0b cdw11:00000000 00:06:57.389 [2024-12-02 15:22:43.024743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.389 #26 NEW cov: 11756 ft: 13894 corp: 19/76b lim: 10 exec/s: 0 rss: 69Mb L: 4/10 MS: 1 CrossOver- 00:06:57.389 [2024-12-02 15:22:43.065121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002424 cdw11:00000000 00:06:57.389 [2024-12-02 15:22:43.065145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.389 [2024-12-02 15:22:43.065220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002424 cdw11:00000000 00:06:57.389 [2024-12-02 15:22:43.065233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.389 [2024-12-02 15:22:43.065280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:57.389 [2024-12-02 15:22:43.065293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.389 [2024-12-02 15:22:43.065341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000002a cdw11:00000000 00:06:57.389 [2024-12-02 15:22:43.065353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:57.389 [2024-12-02 15:22:43.065400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:06:57.389 [2024-12-02 15:22:43.065412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:57.389 #27 NEW cov: 11756 ft: 13912 corp: 20/86b lim: 10 exec/s: 27 rss: 69Mb L: 10/10 MS: 1 CrossOver- 00:06:57.389 [2024-12-02 15:22:43.104766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008310 cdw11:00000000 00:06:57.389 [2024-12-02 15:22:43.104790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.389 #28 NEW cov: 11756 ft: 13936 corp: 21/89b lim: 10 exec/s: 28 rss: 
69Mb L: 3/10 MS: 1 InsertByte- 00:06:57.389 [2024-12-02 15:22:43.145358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002424 cdw11:00000000 00:06:57.389 [2024-12-02 15:22:43.145386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.389 [2024-12-02 15:22:43.145454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000240b cdw11:00000000 00:06:57.389 [2024-12-02 15:22:43.145468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.389 [2024-12-02 15:22:43.145517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000b00 cdw11:00000000 00:06:57.389 [2024-12-02 15:22:43.145530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.389 [2024-12-02 15:22:43.145579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000023 cdw11:00000000 00:06:57.389 [2024-12-02 15:22:43.145592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:57.389 [2024-12-02 15:22:43.145639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:06:57.389 [2024-12-02 15:22:43.145651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:57.389 #29 NEW cov: 11756 ft: 13954 corp: 22/99b lim: 10 exec/s: 29 rss: 69Mb L: 10/10 MS: 1 CrossOver- 00:06:57.389 [2024-12-02 15:22:43.185499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002424 cdw11:00000000 00:06:57.390 [2024-12-02 15:22:43.185524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.390 [2024-12-02 15:22:43.185572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002424 cdw11:00000000 00:06:57.390 [2024-12-02 15:22:43.185586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.390 [2024-12-02 15:22:43.185634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000024 cdw11:00000000 00:06:57.390 [2024-12-02 15:22:43.185647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.390 [2024-12-02 15:22:43.185694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000242a cdw11:00000000 00:06:57.390 [2024-12-02 15:22:43.185707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:57.390 [2024-12-02 15:22:43.185758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:06:57.390 [2024-12-02 15:22:43.185771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:57.648 #30 NEW cov: 11756 ft: 13959 corp: 23/109b lim: 10 exec/s: 30 rss: 
69Mb L: 10/10 MS: 1 CopyPart- 00:06:57.648 [2024-12-02 15:22:43.225237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b0b cdw11:00000000 00:06:57.648 [2024-12-02 15:22:43.225261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.648 [2024-12-02 15:22:43.225327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005c5c cdw11:00000000 00:06:57.648 [2024-12-02 15:22:43.225341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.648 #31 NEW cov: 11756 ft: 13972 corp: 24/114b lim: 10 exec/s: 31 rss: 70Mb L: 5/10 MS: 1 CrossOver- 00:06:57.648 [2024-12-02 15:22:43.265569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b0b cdw11:00000000 00:06:57.648 [2024-12-02 15:22:43.265596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.648 [2024-12-02 15:22:43.265661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005c5c cdw11:00000000 00:06:57.648 [2024-12-02 15:22:43.265675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.648 [2024-12-02 15:22:43.265722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000b0b cdw11:00000000 00:06:57.648 [2024-12-02 15:22:43.265739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.648 [2024-12-02 15:22:43.265789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000b5c cdw11:00000000 00:06:57.648 [2024-12-02 15:22:43.265801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:57.648 #32 NEW cov: 11756 ft: 13996 corp: 25/122b lim: 10 exec/s: 32 rss: 70Mb L: 8/10 MS: 1 CopyPart- 00:06:57.648 [2024-12-02 15:22:43.305800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002424 cdw11:00000000 00:06:57.648 [2024-12-02 15:22:43.305824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.648 [2024-12-02 15:22:43.305899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002424 cdw11:00000000 00:06:57.648 [2024-12-02 15:22:43.305913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.648 [2024-12-02 15:22:43.305962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:57.648 [2024-12-02 15:22:43.305975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.648 [2024-12-02 15:22:43.306024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000002a cdw11:00000000 00:06:57.648 [2024-12-02 15:22:43.306036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:06:57.648 [2024-12-02 15:22:43.306085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:06:57.648 [2024-12-02 15:22:43.306097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:57.648 #33 NEW cov: 11756 ft: 14003 corp: 26/132b lim: 10 exec/s: 33 rss: 70Mb L: 10/10 MS: 1 CopyPart- 00:06:57.648 [2024-12-02 15:22:43.345448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000031ff cdw11:00000000 00:06:57.648 [2024-12-02 15:22:43.345472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.649 #34 NEW cov: 11756 ft: 14017 corp: 27/135b lim: 10 exec/s: 34 rss: 70Mb L: 3/10 MS: 1 ShuffleBytes- 00:06:57.649 [2024-12-02 15:22:43.385961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002424 cdw11:00000000 00:06:57.649 [2024-12-02 15:22:43.385985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.649 [2024-12-02 15:22:43.386052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000240b cdw11:00000000 00:06:57.649 [2024-12-02 15:22:43.386066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.649 [2024-12-02 15:22:43.386114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000b00 cdw11:00000000 00:06:57.649 [2024-12-02 15:22:43.386127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.649 [2024-12-02 15:22:43.386178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000023 cdw11:00000000 00:06:57.649 [2024-12-02 15:22:43.386191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:57.649 #35 NEW cov: 11756 ft: 14026 corp: 28/144b lim: 10 exec/s: 35 rss: 70Mb L: 9/10 MS: 1 EraseBytes- 00:06:57.649 [2024-12-02 15:22:43.426043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:57.649 [2024-12-02 15:22:43.426066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.649 [2024-12-02 15:22:43.426116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:57.649 [2024-12-02 15:22:43.426129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.649 [2024-12-02 15:22:43.426176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000083 cdw11:00000000 00:06:57.649 [2024-12-02 15:22:43.426189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.649 [2024-12-02 15:22:43.426238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002a0a cdw11:00000000 00:06:57.649 [2024-12-02 
15:22:43.426250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:57.649 #36 NEW cov: 11756 ft: 14031 corp: 29/153b lim: 10 exec/s: 36 rss: 70Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:06:57.649 [2024-12-02 15:22:43.466073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000832a cdw11:00000000 00:06:57.649 [2024-12-02 15:22:43.466097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.649 [2024-12-02 15:22:43.466148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000bff cdw11:00000000 00:06:57.649 [2024-12-02 15:22:43.466162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.649 [2024-12-02 15:22:43.466209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff01 cdw11:00000000 00:06:57.649 [2024-12-02 15:22:43.466222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.908 #37 NEW cov: 11756 ft: 14038 corp: 30/160b lim: 10 exec/s: 37 rss: 70Mb L: 7/10 MS: 1 ChangeBinInt- 00:06:57.908 [2024-12-02 15:22:43.506181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b00 cdw11:00000000 00:06:57.908 [2024-12-02 15:22:43.506205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.506255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000023 cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.506268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.506316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.506344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.909 #38 NEW cov: 11756 ft: 14056 corp: 31/166b lim: 10 exec/s: 38 rss: 70Mb L: 6/10 MS: 1 EraseBytes- 00:06:57.909 [2024-12-02 15:22:43.546503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a24 cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.546528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.546597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000024 cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.546611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.546662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002400 cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.546674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.546723] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000024 cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.546742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.546792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.546804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:57.909 #39 NEW cov: 11756 ft: 14099 corp: 32/176b lim: 10 exec/s: 39 rss: 70Mb L: 10/10 MS: 1 ShuffleBytes- 00:06:57.909 [2024-12-02 15:22:43.586516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b0b cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.586541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.586593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005c5c cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.586607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.586655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000b0b cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.586667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.586715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000b58 cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.586728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:57.909 #40 NEW cov: 11756 ft: 14103 corp: 33/184b lim: 10 exec/s: 40 rss: 70Mb L: 8/10 MS: 1 ChangeBit- 00:06:57.909 [2024-12-02 15:22:43.626672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002424 cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.626698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.626753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002624 cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.626767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.626815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000024 cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.626829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.626878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000242a cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.626890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.626940] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.626961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:57.909 #41 NEW cov: 11756 ft: 14106 corp: 34/194b lim: 10 exec/s: 41 rss: 70Mb L: 10/10 MS: 1 ChangeBit- 00:06:57.909 [2024-12-02 15:22:43.666746] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002424 cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.666772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.666821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000240b cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.666834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.666884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000240b cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.666898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.666947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000b00 cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.666959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:57.909 #42 NEW cov: 11756 ft: 14123 corp: 35/203b lim: 10 exec/s: 42 rss: 70Mb L: 9/10 MS: 1 CopyPart- 00:06:57.909 [2024-12-02 15:22:43.706729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002424 cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.706760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.706809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.706822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.909 [2024-12-02 15:22:43.706872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002a00 cdw11:00000000 00:06:57.909 [2024-12-02 15:22:43.706885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.168 #43 NEW cov: 11756 ft: 14202 corp: 36/210b lim: 10 exec/s: 43 rss: 70Mb L: 7/10 MS: 1 EraseBytes- 00:06:58.168 [2024-12-02 15:22:43.746731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000b2a cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.746759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.168 [2024-12-02 15:22:43.746824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000830a cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.746838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.168 #44 NEW cov: 11756 ft: 14266 corp: 37/214b lim: 10 exec/s: 44 rss: 70Mb L: 4/10 MS: 1 ShuffleBytes- 00:06:58.168 [2024-12-02 15:22:43.787082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.787106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.168 [2024-12-02 15:22:43.787157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.787171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.168 [2024-12-02 15:22:43.787220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000a00 cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.787236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.168 [2024-12-02 15:22:43.787285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.787298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:58.168 #45 NEW cov: 11756 ft: 14320 corp: 38/222b lim: 10 exec/s: 45 rss: 70Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:06:58.168 [2024-12-02 15:22:43.827000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002424 cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.827026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.168 [2024-12-02 15:22:43.827075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00002a00 cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.827088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.168 #46 NEW cov: 11756 ft: 14360 corp: 39/227b lim: 10 exec/s: 46 rss: 70Mb L: 5/10 MS: 1 EraseBytes- 00:06:58.168 [2024-12-02 15:22:43.867388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a24 cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.867413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.168 [2024-12-02 15:22:43.867462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000024 cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.867475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.168 [2024-12-02 15:22:43.867522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00002400 cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.867535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.168 [2024-12-02 15:22:43.867586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000024 
cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.867598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:58.168 [2024-12-02 15:22:43.867646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000100a cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.867658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:58.168 #47 NEW cov: 11756 ft: 14409 corp: 40/237b lim: 10 exec/s: 47 rss: 70Mb L: 10/10 MS: 1 ChangeBit- 00:06:58.168 [2024-12-02 15:22:43.907208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000240b cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.907232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.168 [2024-12-02 15:22:43.907283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000b00 cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.907297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.168 #48 NEW cov: 11756 ft: 14419 corp: 41/242b lim: 10 exec/s: 48 rss: 70Mb L: 5/10 MS: 1 EraseBytes- 00:06:58.168 [2024-12-02 15:22:43.947408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000bff cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.947433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.168 [2024-12-02 15:22:43.947485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000d23 cdw11:00000000 00:06:58.168 [2024-12-02 15:22:43.947501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.169 [2024-12-02 15:22:43.947550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 00:06:58.169 [2024-12-02 15:22:43.947563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.169 #49 NEW cov: 11756 ft: 14424 corp: 42/248b lim: 10 exec/s: 49 rss: 70Mb L: 6/10 MS: 1 CMP- DE: "\377\015"- 00:06:58.428 [2024-12-02 15:22:43.987547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000832a cdw11:00000000 00:06:58.428 [2024-12-02 15:22:43.987572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.428 [2024-12-02 15:22:43.987623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000832a cdw11:00000000 00:06:58.428 [2024-12-02 15:22:43.987636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.428 [2024-12-02 15:22:43.987686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000b01 cdw11:00000000 00:06:58.428 [2024-12-02 15:22:43.987699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.428 [2024-12-02 
15:22:44.027747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000832a cdw11:00000000 00:06:58.428 [2024-12-02 15:22:44.027771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.428 [2024-12-02 15:22:44.027836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000832a cdw11:00000000 00:06:58.428 [2024-12-02 15:22:44.027850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.428 [2024-12-02 15:22:44.027898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000b01 cdw11:00000000 00:06:58.428 [2024-12-02 15:22:44.027911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.428 [2024-12-02 15:22:44.027960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000090a cdw11:00000000 00:06:58.428 [2024-12-02 15:22:44.027973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:58.428 #51 NEW cov: 11756 ft: 14466 corp: 43/256b lim: 10 exec/s: 51 rss: 70Mb L: 8/10 MS: 2 CopyPart-InsertByte- 00:06:58.428 [2024-12-02 15:22:44.067869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002424 cdw11:00000000 00:06:58.428 [2024-12-02 15:22:44.067894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.428 [2024-12-02 15:22:44.067944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000240b cdw11:00000000 00:06:58.428 [2024-12-02 15:22:44.067957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.428 [2024-12-02 15:22:44.068007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000240b cdw11:00000000 00:06:58.428 [2024-12-02 15:22:44.068019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.428 [2024-12-02 15:22:44.068068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000b10 cdw11:00000000 00:06:58.428 [2024-12-02 15:22:44.068081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:58.428 #52 NEW cov: 11756 ft: 14505 corp: 44/265b lim: 10 exec/s: 26 rss: 70Mb L: 9/10 MS: 1 ChangeByte- 00:06:58.428 #52 DONE cov: 11756 ft: 14505 corp: 44/265b lim: 10 exec/s: 26 rss: 70Mb 00:06:58.428 ###### Recommended dictionary. ###### 00:06:58.428 "\377\015" # Uses: 0 00:06:58.428 ###### End of recommended dictionary. 
###### 00:06:58.428 Done 52 runs in 2 second(s) 00:06:58.428 15:22:44 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:06:58.428 15:22:44 -- ../common.sh@72 -- # (( i++ )) 00:06:58.428 15:22:44 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:58.428 15:22:44 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:06:58.428 15:22:44 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:06:58.428 15:22:44 -- nvmf/run.sh@24 -- # local timen=1 00:06:58.428 15:22:44 -- nvmf/run.sh@25 -- # local core=0x1 00:06:58.428 15:22:44 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:06:58.428 15:22:44 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:06:58.429 15:22:44 -- nvmf/run.sh@29 -- # printf %02d 8 00:06:58.429 15:22:44 -- nvmf/run.sh@29 -- # port=4408 00:06:58.429 15:22:44 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:06:58.429 15:22:44 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:06:58.429 15:22:44 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:58.429 15:22:44 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:06:58.688 [2024-12-02 15:22:44.252561] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:58.688 [2024-12-02 15:22:44.252639] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1598940 ] 00:06:58.688 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.688 [2024-12-02 15:22:44.501481] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.948 [2024-12-02 15:22:44.589780] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:58.948 [2024-12-02 15:22:44.589900] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.948 [2024-12-02 15:22:44.647526] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:58.948 [2024-12-02 15:22:44.663860] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:06:58.948 INFO: Running with entropic power schedule (0xFF, 100). 
00:06:58.948 INFO: Seed: 545896423 00:06:58.948 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:58.948 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:58.948 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:06:58.948 INFO: A corpus is not provided, starting from an empty corpus 00:06:58.948 [2024-12-02 15:22:44.719012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.948 [2024-12-02 15:22:44.719039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.948 #2 INITED cov: 11552 ft: 11558 corp: 1/1b exec/s: 0 rss: 67Mb 00:06:58.948 [2024-12-02 15:22:44.749120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.948 [2024-12-02 15:22:44.749146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.948 [2024-12-02 15:22:44.749198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.948 [2024-12-02 15:22:44.749215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.466 NEW_FUNC[1/1]: 0x1c310f8 in accel_comp_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/accel/accel_sw.c:554 00:06:59.466 #3 NEW cov: 11670 ft: 12648 corp: 2/3b lim: 5 exec/s: 0 rss: 69Mb L: 2/2 MS: 1 CopyPart- 00:06:59.466 [2024-12-02 15:22:45.049919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.466 [2024-12-02 15:22:45.049949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.466 #4 NEW cov: 11676 ft: 12920 corp: 3/4b lim: 5 exec/s: 0 rss: 69Mb L: 1/2 MS: 1 CrossOver- 00:06:59.466 [2024-12-02 15:22:45.089997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.466 [2024-12-02 15:22:45.090024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.466 #5 NEW cov: 11761 ft: 13278 corp: 4/5b lim: 5 exec/s: 0 rss: 69Mb L: 1/2 MS: 1 ChangeBit- 00:06:59.466 [2024-12-02 15:22:45.130391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.466 [2024-12-02 15:22:45.130417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.466 [2024-12-02 15:22:45.130476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.466 [2024-12-02 15:22:45.130490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 
m:0 dnr:0 00:06:59.466 [2024-12-02 15:22:45.130546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.466 [2024-12-02 15:22:45.130559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.466 #6 NEW cov: 11761 ft: 13604 corp: 5/8b lim: 5 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 CrossOver- 00:06:59.466 [2024-12-02 15:22:45.180545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.466 [2024-12-02 15:22:45.180571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.466 [2024-12-02 15:22:45.180632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.466 [2024-12-02 15:22:45.180645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.466 #7 NEW cov: 11761 ft: 13785 corp: 6/10b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 ChangeBit- 00:06:59.466 [2024-12-02 15:22:45.220488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.466 [2024-12-02 15:22:45.220513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.466 [2024-12-02 15:22:45.220587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.466 [2024-12-02 15:22:45.220601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.466 #8 NEW cov: 11761 ft: 13856 corp: 7/12b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 InsertByte- 00:06:59.466 [2024-12-02 15:22:45.260583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.466 [2024-12-02 15:22:45.260608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.466 [2024-12-02 15:22:45.260681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.466 [2024-12-02 15:22:45.260696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.725 #9 NEW cov: 11761 ft: 13926 corp: 8/14b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 ShuffleBytes- 00:06:59.725 [2024-12-02 15:22:45.300881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.725 [2024-12-02 15:22:45.300906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.725 [2024-12-02 15:22:45.300979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.725 [2024-12-02 15:22:45.300994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.725 [2024-12-02 15:22:45.301049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.725 [2024-12-02 15:22:45.301063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.725 #10 NEW cov: 11761 ft: 13981 corp: 9/17b lim: 5 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 ChangeBinInt- 00:06:59.725 [2024-12-02 15:22:45.340684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.725 [2024-12-02 15:22:45.340709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.725 #11 NEW cov: 11761 ft: 14083 corp: 10/18b lim: 5 exec/s: 0 rss: 69Mb L: 1/3 MS: 1 EraseBytes- 00:06:59.725 [2024-12-02 15:22:45.380973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.725 [2024-12-02 15:22:45.380998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.725 [2024-12-02 15:22:45.381055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.725 [2024-12-02 15:22:45.381069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.725 #12 NEW cov: 11761 ft: 14123 corp: 11/20b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 ChangeBit- 00:06:59.725 [2024-12-02 15:22:45.421068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.725 [2024-12-02 15:22:45.421093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.725 [2024-12-02 15:22:45.421152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.725 [2024-12-02 15:22:45.421166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.725 #13 NEW cov: 11761 ft: 14142 corp: 12/22b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 CopyPart- 00:06:59.725 [2024-12-02 15:22:45.461349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.725 [2024-12-02 15:22:45.461377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.725 [2024-12-02 15:22:45.461436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:06:59.726 [2024-12-02 15:22:45.461449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.726 [2024-12-02 15:22:45.461505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.726 [2024-12-02 15:22:45.461518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.726 #14 NEW cov: 11761 ft: 14190 corp: 13/25b lim: 5 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 CrossOver- 00:06:59.726 [2024-12-02 15:22:45.501153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.726 [2024-12-02 15:22:45.501179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.726 #15 NEW cov: 11761 ft: 14229 corp: 14/26b lim: 5 exec/s: 0 rss: 69Mb L: 1/3 MS: 1 ShuffleBytes- 00:06:59.726 [2024-12-02 15:22:45.541272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.726 [2024-12-02 15:22:45.541298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.985 #16 NEW cov: 11761 ft: 14267 corp: 15/27b lim: 5 exec/s: 0 rss: 69Mb L: 1/3 MS: 1 CrossOver- 00:06:59.985 [2024-12-02 15:22:45.581692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.581717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.985 [2024-12-02 15:22:45.581779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.581793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.985 [2024-12-02 15:22:45.581864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.581878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.985 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:59.985 #17 NEW cov: 11784 ft: 14388 corp: 16/30b lim: 5 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 CopyPart- 00:06:59.985 [2024-12-02 15:22:45.621867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.621893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.985 [2024-12-02 15:22:45.621951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 
SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.621965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.985 [2024-12-02 15:22:45.622019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.622035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.985 #18 NEW cov: 11784 ft: 14404 corp: 17/33b lim: 5 exec/s: 0 rss: 70Mb L: 3/3 MS: 1 ChangeBinInt- 00:06:59.985 [2024-12-02 15:22:45.662291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.662316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.985 [2024-12-02 15:22:45.662374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.662388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.985 [2024-12-02 15:22:45.662446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.662459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.985 [2024-12-02 15:22:45.662514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.662528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:59.985 [2024-12-02 15:22:45.662583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.662596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:59.985 #19 NEW cov: 11784 ft: 14732 corp: 18/38b lim: 5 exec/s: 0 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:06:59.985 [2024-12-02 15:22:45.702081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.702106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.985 [2024-12-02 15:22:45.702178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.702192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.985 [2024-12-02 15:22:45.702249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.702262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.985 #20 NEW cov: 11784 ft: 14750 corp: 19/41b lim: 5 exec/s: 20 rss: 70Mb L: 3/5 MS: 1 InsertByte- 00:06:59.985 [2024-12-02 15:22:45.742070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.742095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.985 [2024-12-02 15:22:45.742154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.742168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.985 #21 NEW cov: 11784 ft: 14774 corp: 20/43b lim: 5 exec/s: 21 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes- 00:06:59.985 [2024-12-02 15:22:45.782280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.782304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.985 [2024-12-02 15:22:45.782365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.985 [2024-12-02 15:22:45.782378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.985 [2024-12-02 15:22:45.782436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.986 [2024-12-02 15:22:45.782449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.245 #22 NEW cov: 11784 ft: 14795 corp: 21/46b lim: 5 exec/s: 22 rss: 70Mb L: 3/5 MS: 1 InsertByte- 00:07:00.245 [2024-12-02 15:22:45.822128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.245 [2024-12-02 15:22:45.822152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.245 #23 NEW cov: 11784 ft: 14816 corp: 22/47b lim: 5 exec/s: 23 rss: 70Mb L: 1/5 MS: 1 ShuffleBytes- 00:07:00.245 [2024-12-02 15:22:45.852205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.245 [2024-12-02 15:22:45.852230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.245 #24 NEW cov: 11784 ft: 14831 corp: 23/48b lim: 5 exec/s: 24 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:07:00.245 [2024-12-02 15:22:45.892498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.245 [2024-12-02 15:22:45.892523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.245 [2024-12-02 15:22:45.892582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.245 [2024-12-02 15:22:45.892596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.245 #25 NEW cov: 11784 ft: 14860 corp: 24/50b lim: 5 exec/s: 25 rss: 70Mb L: 2/5 MS: 1 CopyPart- 00:07:00.245 [2024-12-02 15:22:45.932920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.245 [2024-12-02 15:22:45.932946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.245 [2024-12-02 15:22:45.933006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.245 [2024-12-02 15:22:45.933019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.245 [2024-12-02 15:22:45.933075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.245 [2024-12-02 15:22:45.933088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.245 [2024-12-02 15:22:45.933144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.245 [2024-12-02 15:22:45.933159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.245 #26 NEW cov: 11784 ft: 14898 corp: 25/54b lim: 5 exec/s: 26 rss: 70Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:00.245 [2024-12-02 15:22:45.972539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.245 [2024-12-02 15:22:45.972564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.245 #27 NEW cov: 11784 ft: 14904 corp: 26/55b lim: 5 exec/s: 27 rss: 70Mb L: 1/5 MS: 1 CrossOver- 00:07:00.245 [2024-12-02 15:22:46.002628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.245 [2024-12-02 15:22:46.002654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.245 #28 NEW cov: 11784 ft: 14917 corp: 27/56b lim: 5 exec/s: 28 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:07:00.245 [2024-12-02 15:22:46.042930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:00.245 [2024-12-02 15:22:46.042955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.245 [2024-12-02 15:22:46.043029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.245 [2024-12-02 15:22:46.043043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.245 #29 NEW cov: 11784 ft: 14997 corp: 28/58b lim: 5 exec/s: 29 rss: 70Mb L: 2/5 MS: 1 CrossOver- 00:07:00.505 [2024-12-02 15:22:46.083027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.505 [2024-12-02 15:22:46.083052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.505 [2024-12-02 15:22:46.083109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.505 [2024-12-02 15:22:46.083122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.505 #30 NEW cov: 11784 ft: 15039 corp: 29/60b lim: 5 exec/s: 30 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:07:00.505 [2024-12-02 15:22:46.123148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.505 [2024-12-02 15:22:46.123173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.505 [2024-12-02 15:22:46.123232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.505 [2024-12-02 15:22:46.123246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.505 #31 NEW cov: 11784 ft: 15082 corp: 30/62b lim: 5 exec/s: 31 rss: 70Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:00.505 [2024-12-02 15:22:46.163444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.505 [2024-12-02 15:22:46.163468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.505 [2024-12-02 15:22:46.163527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.505 [2024-12-02 15:22:46.163543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.505 [2024-12-02 15:22:46.163598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.505 [2024-12-02 15:22:46.163611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.505 
#32 NEW cov: 11784 ft: 15095 corp: 31/65b lim: 5 exec/s: 32 rss: 70Mb L: 3/5 MS: 1 CrossOver- 00:07:00.505 [2024-12-02 15:22:46.203730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.505 [2024-12-02 15:22:46.203773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.505 [2024-12-02 15:22:46.203832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.505 [2024-12-02 15:22:46.203845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.505 [2024-12-02 15:22:46.203901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.505 [2024-12-02 15:22:46.203914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.505 [2024-12-02 15:22:46.203971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.505 [2024-12-02 15:22:46.203984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.505 #33 NEW cov: 11784 ft: 15118 corp: 32/69b lim: 5 exec/s: 33 rss: 70Mb L: 4/5 MS: 1 InsertByte- 00:07:00.505 [2024-12-02 15:22:46.243715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.505 [2024-12-02 15:22:46.243745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.505 [2024-12-02 15:22:46.243804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.505 [2024-12-02 15:22:46.243817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.506 [2024-12-02 15:22:46.243874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.506 [2024-12-02 15:22:46.243887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.506 #34 NEW cov: 11784 ft: 15131 corp: 33/72b lim: 5 exec/s: 34 rss: 70Mb L: 3/5 MS: 1 ChangeBit- 00:07:00.506 [2024-12-02 15:22:46.283650] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.506 [2024-12-02 15:22:46.283674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.506 [2024-12-02 15:22:46.283731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:00.506 [2024-12-02 15:22:46.283750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.506 #35 NEW cov: 11784 ft: 15144 corp: 34/74b lim: 5 exec/s: 35 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:07:00.768 [2024-12-02 15:22:46.324140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.768 [2024-12-02 15:22:46.324166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.768 [2024-12-02 15:22:46.324224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.768 [2024-12-02 15:22:46.324238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.768 [2024-12-02 15:22:46.324295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.768 [2024-12-02 15:22:46.324308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.768 [2024-12-02 15:22:46.324366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.768 [2024-12-02 15:22:46.324379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.768 #36 NEW cov: 11784 ft: 15161 corp: 35/78b lim: 5 exec/s: 36 rss: 70Mb L: 4/5 MS: 1 ChangeBit- 00:07:00.768 [2024-12-02 15:22:46.363745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.768 [2024-12-02 15:22:46.363770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.768 #37 NEW cov: 11784 ft: 15168 corp: 36/79b lim: 5 exec/s: 37 rss: 70Mb L: 1/5 MS: 1 CopyPart- 00:07:00.768 [2024-12-02 15:22:46.393956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.768 [2024-12-02 15:22:46.393981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.768 [2024-12-02 15:22:46.394056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.768 [2024-12-02 15:22:46.394070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.768 #38 NEW cov: 11784 ft: 15182 corp: 37/81b lim: 5 exec/s: 38 rss: 70Mb L: 2/5 MS: 1 InsertByte- 00:07:00.768 [2024-12-02 15:22:46.434442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.768 [2024-12-02 15:22:46.434467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.768 [2024-12-02 15:22:46.434527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.768 [2024-12-02 15:22:46.434540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.768 [2024-12-02 15:22:46.434594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.768 [2024-12-02 15:22:46.434607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.768 [2024-12-02 15:22:46.434664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.768 [2024-12-02 15:22:46.434677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.768 #39 NEW cov: 11784 ft: 15200 corp: 38/85b lim: 5 exec/s: 39 rss: 70Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:00.768 [2024-12-02 15:22:46.474328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.768 [2024-12-02 15:22:46.474352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.768 [2024-12-02 15:22:46.474407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.768 [2024-12-02 15:22:46.474421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.769 [2024-12-02 15:22:46.474476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.769 [2024-12-02 15:22:46.474489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.769 #40 NEW cov: 11784 ft: 15279 corp: 39/88b lim: 5 exec/s: 40 rss: 70Mb L: 3/5 MS: 1 ShuffleBytes- 00:07:00.769 [2024-12-02 15:22:46.514119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.769 [2024-12-02 15:22:46.514144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.769 #41 NEW cov: 11784 ft: 15290 corp: 40/89b lim: 5 exec/s: 41 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:07:00.769 [2024-12-02 15:22:46.554886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.769 [2024-12-02 15:22:46.554911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.769 [2024-12-02 15:22:46.554972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) 
qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.769 [2024-12-02 15:22:46.554985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.769 [2024-12-02 15:22:46.555044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.769 [2024-12-02 15:22:46.555057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.769 [2024-12-02 15:22:46.555112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.769 [2024-12-02 15:22:46.555125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.769 [2024-12-02 15:22:46.555180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.769 [2024-12-02 15:22:46.555193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:00.769 #42 NEW cov: 11784 ft: 15291 corp: 41/94b lim: 5 exec/s: 42 rss: 70Mb L: 5/5 MS: 1 CopyPart- 00:07:01.107 [2024-12-02 15:22:46.594695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.107 [2024-12-02 15:22:46.594720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.107 [2024-12-02 15:22:46.594783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.107 [2024-12-02 15:22:46.594800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.107 [2024-12-02 15:22:46.594860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.107 [2024-12-02 15:22:46.594874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.107 #43 NEW cov: 11784 ft: 15361 corp: 42/97b lim: 5 exec/s: 43 rss: 70Mb L: 3/5 MS: 1 ChangeBinInt- 00:07:01.107 [2024-12-02 15:22:46.644832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.107 [2024-12-02 15:22:46.644858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.107 [2024-12-02 15:22:46.644918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.107 [2024-12-02 15:22:46.644931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.107 [2024-12-02 15:22:46.645004] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.107 [2024-12-02 15:22:46.645018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.107 #44 NEW cov: 11784 ft: 15364 corp: 43/100b lim: 5 exec/s: 44 rss: 70Mb L: 3/5 MS: 1 CrossOver- 00:07:01.107 [2024-12-02 15:22:46.684818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.107 [2024-12-02 15:22:46.684844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.107 [2024-12-02 15:22:46.684904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.107 [2024-12-02 15:22:46.684919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.107 #45 NEW cov: 11784 ft: 15365 corp: 44/102b lim: 5 exec/s: 22 rss: 70Mb L: 2/5 MS: 1 EraseBytes- 00:07:01.107 #45 DONE cov: 11784 ft: 15365 corp: 44/102b lim: 5 exec/s: 22 rss: 70Mb 00:07:01.107 Done 45 runs in 2 second(s) 00:07:01.107 15:22:46 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:07:01.107 15:22:46 -- ../common.sh@72 -- # (( i++ )) 00:07:01.107 15:22:46 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:01.107 15:22:46 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:01.107 15:22:46 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:01.107 15:22:46 -- nvmf/run.sh@24 -- # local timen=1 00:07:01.107 15:22:46 -- nvmf/run.sh@25 -- # local core=0x1 00:07:01.108 15:22:46 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:01.108 15:22:46 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:01.108 15:22:46 -- nvmf/run.sh@29 -- # printf %02d 9 00:07:01.108 15:22:46 -- nvmf/run.sh@29 -- # port=4409 00:07:01.108 15:22:46 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:01.108 15:22:46 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:01.108 15:22:46 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:01.108 15:22:46 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:07:01.373 [2024-12-02 15:22:46.870620] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
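The xtrace above shows common.sh advancing its loop to fuzzer 9 and nvmf/run.sh deriving a per-instance TCP port, transport ID, corpus directory, and JSON config before launching llvm_nvme_fuzz (whose SPDK/DPDK startup follows). A minimal sketch of that setup, reconstructed from the trace alone; the 44-prefixed port rule, the rootdir variable, and the redirection of sed's output into the per-instance config are inferred assumptions rather than shown verbatim in the log:

    #!/usr/bin/env bash
    # Sketch of the per-fuzzer setup traced in nvmf/run.sh above; rootdir and
    # the sed output redirection are assumptions inferred from the -c argument.
    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk

    start_llvm_fuzz() {
        local fuzzer_type=$1 timen=$2 core=$3
        local corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
        local nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf

        # Fuzzer N appears to get TCP port 44NN, e.g. 9 -> 4409
        # (inferred from the printf %02d step and port=4409 in the trace).
        local port=44$(printf '%02d' "$fuzzer_type")
        mkdir -p "$corpus_dir"

        # Transport ID handed to the fuzzer via -F.
        local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

        # Point the shared JSON template at this instance's port.
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
            "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

        "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
            -P "$rootdir/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
            -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk$fuzzer_type.sock"
    }

Each instance presumably gets its own trsvcid and /var/tmp/spdkN.sock so concurrent fuzzers do not collide on a TCP listener or RPC socket. In the libFuzzer output that follows, cov counts covered code edges, ft counts features, corp gives corpus entries and total bytes, lim is the current input-length cap, L: is the new input's length over that cap, and MS: names the mutation(s) that produced it.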
00:07:01.373 [2024-12-02 15:22:46.870690] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1599330 ] 00:07:01.373 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.373 [2024-12-02 15:22:47.123144] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.632 [2024-12-02 15:22:47.205643] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:01.632 [2024-12-02 15:22:47.205796] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.632 [2024-12-02 15:22:47.263483] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:01.632 [2024-12-02 15:22:47.279813] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:01.632 INFO: Running with entropic power schedule (0xFF, 100). 00:07:01.632 INFO: Seed: 3159897234 00:07:01.632 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:01.632 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:01.632 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:01.632 INFO: A corpus is not provided, starting from an empty corpus 00:07:01.632 [2024-12-02 15:22:47.324981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.632 [2024-12-02 15:22:47.325010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.632 #2 INITED cov: 11557 ft: 11558 corp: 1/1b exec/s: 0 rss: 67Mb 00:07:01.632 [2024-12-02 15:22:47.355152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.632 [2024-12-02 15:22:47.355177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.632 [2024-12-02 15:22:47.355245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.632 [2024-12-02 15:22:47.355259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.632 #3 NEW cov: 11670 ft: 12680 corp: 2/3b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CopyPart- 00:07:01.632 [2024-12-02 15:22:47.395222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.632 [2024-12-02 15:22:47.395247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.632 [2024-12-02 15:22:47.395316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.632 [2024-12-02 15:22:47.395330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.632 #4 NEW cov: 11676 ft: 12843 corp: 3/5b lim: 5 exec/s: 0 rss: 
67Mb L: 2/2 MS: 1 InsertByte- 00:07:01.632 [2024-12-02 15:22:47.435174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.632 [2024-12-02 15:22:47.435198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.892 #5 NEW cov: 11761 ft: 13155 corp: 4/6b lim: 5 exec/s: 0 rss: 68Mb L: 1/2 MS: 1 EraseBytes- 00:07:01.892 [2024-12-02 15:22:47.475340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.892 [2024-12-02 15:22:47.475368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.892 #6 NEW cov: 11761 ft: 13345 corp: 5/7b lim: 5 exec/s: 0 rss: 68Mb L: 1/2 MS: 1 ChangeByte- 00:07:01.892 [2024-12-02 15:22:47.515423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.892 [2024-12-02 15:22:47.515448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.892 #7 NEW cov: 11761 ft: 13441 corp: 6/8b lim: 5 exec/s: 0 rss: 68Mb L: 1/2 MS: 1 EraseBytes- 00:07:01.892 [2024-12-02 15:22:47.555535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.892 [2024-12-02 15:22:47.555560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.892 #8 NEW cov: 11761 ft: 13525 corp: 7/9b lim: 5 exec/s: 0 rss: 68Mb L: 1/2 MS: 1 ShuffleBytes- 00:07:01.892 [2024-12-02 15:22:47.595978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.892 [2024-12-02 15:22:47.596003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.892 [2024-12-02 15:22:47.596056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.892 [2024-12-02 15:22:47.596069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.892 [2024-12-02 15:22:47.596121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.892 [2024-12-02 15:22:47.596134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.892 #9 NEW cov: 11761 ft: 13726 corp: 8/12b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 InsertByte- 00:07:01.892 [2024-12-02 15:22:47.635783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.892 [2024-12-02 15:22:47.635808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:01.892 #10 NEW cov: 11761 ft: 13833 corp: 9/13b lim: 5 exec/s: 0 rss: 68Mb L: 1/3 MS: 1 ChangeByte- 00:07:01.892 [2024-12-02 15:22:47.666325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.892 [2024-12-02 15:22:47.666349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.892 [2024-12-02 15:22:47.666403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.892 [2024-12-02 15:22:47.666416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.892 [2024-12-02 15:22:47.666469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.892 [2024-12-02 15:22:47.666499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.892 [2024-12-02 15:22:47.666553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.892 [2024-12-02 15:22:47.666566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:01.892 #11 NEW cov: 11761 ft: 14109 corp: 10/17b lim: 5 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:01.892 [2024-12-02 15:22:47.705994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:01.892 [2024-12-02 15:22:47.706019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.152 #12 NEW cov: 11761 ft: 14138 corp: 11/18b lim: 5 exec/s: 0 rss: 68Mb L: 1/4 MS: 1 ChangeBit- 00:07:02.152 [2024-12-02 15:22:47.746254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.152 [2024-12-02 15:22:47.746278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.152 [2024-12-02 15:22:47.746331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.152 [2024-12-02 15:22:47.746345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.152 #13 NEW cov: 11761 ft: 14173 corp: 12/20b lim: 5 exec/s: 0 rss: 68Mb L: 2/4 MS: 1 CopyPart- 00:07:02.152 [2024-12-02 15:22:47.786360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.152 [2024-12-02 15:22:47.786384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.152 [2024-12-02 15:22:47.786452] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.152 [2024-12-02 15:22:47.786465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.152 #14 NEW cov: 11761 ft: 14193 corp: 13/22b lim: 5 exec/s: 0 rss: 68Mb L: 2/4 MS: 1 CrossOver- 00:07:02.152 [2024-12-02 15:22:47.826337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.152 [2024-12-02 15:22:47.826362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.152 #15 NEW cov: 11761 ft: 14217 corp: 14/23b lim: 5 exec/s: 0 rss: 68Mb L: 1/4 MS: 1 ChangeByte- 00:07:02.152 [2024-12-02 15:22:47.866587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.152 [2024-12-02 15:22:47.866611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.152 [2024-12-02 15:22:47.866666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.152 [2024-12-02 15:22:47.866679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.152 #16 NEW cov: 11761 ft: 14268 corp: 15/25b lim: 5 exec/s: 0 rss: 68Mb L: 2/4 MS: 1 InsertByte- 00:07:02.152 [2024-12-02 15:22:47.906725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.152 [2024-12-02 15:22:47.906752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.152 [2024-12-02 15:22:47.906804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.152 [2024-12-02 15:22:47.906817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.152 #17 NEW cov: 11761 ft: 14345 corp: 16/27b lim: 5 exec/s: 0 rss: 68Mb L: 2/4 MS: 1 InsertByte- 00:07:02.152 [2024-12-02 15:22:47.946822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.152 [2024-12-02 15:22:47.946846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.152 [2024-12-02 15:22:47.946902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.152 [2024-12-02 15:22:47.946915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.152 #18 NEW cov: 11761 ft: 14380 corp: 17/29b lim: 5 exec/s: 0 rss: 68Mb L: 2/4 MS: 1 ChangeBit- 00:07:02.412 [2024-12-02 15:22:47.987075] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.412 [2024-12-02 15:22:47.987100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.412 [2024-12-02 15:22:47.987155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.412 [2024-12-02 15:22:47.987168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.412 [2024-12-02 15:22:47.987218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.412 [2024-12-02 15:22:47.987231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.412 #19 NEW cov: 11761 ft: 14408 corp: 18/32b lim: 5 exec/s: 0 rss: 68Mb L: 3/4 MS: 1 InsertByte- 00:07:02.412 [2024-12-02 15:22:48.027055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.412 [2024-12-02 15:22:48.027079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.412 [2024-12-02 15:22:48.027132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.412 [2024-12-02 15:22:48.027145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.412 #20 NEW cov: 11761 ft: 14420 corp: 19/34b lim: 5 exec/s: 0 rss: 68Mb L: 2/4 MS: 1 InsertByte- 00:07:02.412 [2024-12-02 15:22:48.067471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.412 [2024-12-02 15:22:48.067494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.412 [2024-12-02 15:22:48.067548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.412 [2024-12-02 15:22:48.067562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.412 [2024-12-02 15:22:48.067615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.412 [2024-12-02 15:22:48.067628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.412 [2024-12-02 15:22:48.067681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.412 [2024-12-02 15:22:48.067697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:07:02.412 #21 NEW cov: 11761 ft: 14496 corp: 20/38b lim: 5 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 InsertByte- 00:07:02.412 [2024-12-02 15:22:48.107263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.412 [2024-12-02 15:22:48.107287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.412 [2024-12-02 15:22:48.107340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.412 [2024-12-02 15:22:48.107354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.412 #22 NEW cov: 11761 ft: 14513 corp: 21/40b lim: 5 exec/s: 0 rss: 69Mb L: 2/4 MS: 1 CrossOver- 00:07:02.412 [2024-12-02 15:22:48.147400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.412 [2024-12-02 15:22:48.147424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.412 [2024-12-02 15:22:48.147493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.412 [2024-12-02 15:22:48.147506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.412 #23 NEW cov: 11761 ft: 14533 corp: 22/42b lim: 5 exec/s: 0 rss: 69Mb L: 2/4 MS: 1 InsertByte- 00:07:02.412 [2024-12-02 15:22:48.187510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.412 [2024-12-02 15:22:48.187536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.412 [2024-12-02 15:22:48.187590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.412 [2024-12-02 15:22:48.187604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.672 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:02.672 #24 NEW cov: 11784 ft: 14592 corp: 23/44b lim: 5 exec/s: 24 rss: 70Mb L: 2/4 MS: 1 ChangeByte- 00:07:02.672 [2024-12-02 15:22:48.488915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.672 [2024-12-02 15:22:48.488948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.672 [2024-12-02 15:22:48.489005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.672 [2024-12-02 15:22:48.489019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.672 [2024-12-02 15:22:48.489077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.672 [2024-12-02 15:22:48.489091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.672 [2024-12-02 15:22:48.489147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.672 [2024-12-02 15:22:48.489165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.672 [2024-12-02 15:22:48.489222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.672 [2024-12-02 15:22:48.489235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:02.931 #25 NEW cov: 11784 ft: 14662 corp: 24/49b lim: 5 exec/s: 25 rss: 70Mb L: 5/5 MS: 1 CMP- DE: "\001\002\000\000"- 00:07:02.931 [2024-12-02 15:22:48.528476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.931 [2024-12-02 15:22:48.528502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.931 [2024-12-02 15:22:48.528560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.931 [2024-12-02 15:22:48.528574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.931 #26 NEW cov: 11784 ft: 14715 corp: 25/51b lim: 5 exec/s: 26 rss: 70Mb L: 2/5 MS: 1 CopyPart- 00:07:02.931 [2024-12-02 15:22:48.568912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.931 [2024-12-02 15:22:48.568938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.931 [2024-12-02 15:22:48.568994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.931 [2024-12-02 15:22:48.569007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.931 [2024-12-02 15:22:48.569063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.931 [2024-12-02 15:22:48.569076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.932 [2024-12-02 15:22:48.569131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.932 [2024-12-02 
15:22:48.569144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.932 #27 NEW cov: 11784 ft: 14723 corp: 26/55b lim: 5 exec/s: 27 rss: 70Mb L: 4/5 MS: 1 ChangeBit- 00:07:02.932 [2024-12-02 15:22:48.609040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.932 [2024-12-02 15:22:48.609065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.932 [2024-12-02 15:22:48.609121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.932 [2024-12-02 15:22:48.609134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.932 [2024-12-02 15:22:48.609187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.932 [2024-12-02 15:22:48.609201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.932 [2024-12-02 15:22:48.609255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.932 [2024-12-02 15:22:48.609271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.932 #28 NEW cov: 11784 ft: 14750 corp: 27/59b lim: 5 exec/s: 28 rss: 70Mb L: 4/5 MS: 1 ChangeBit- 00:07:02.932 [2024-12-02 15:22:48.648631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.932 [2024-12-02 15:22:48.648656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.932 #29 NEW cov: 11784 ft: 14833 corp: 28/60b lim: 5 exec/s: 29 rss: 70Mb L: 1/5 MS: 1 EraseBytes- 00:07:02.932 [2024-12-02 15:22:48.688939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.932 [2024-12-02 15:22:48.688964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.932 [2024-12-02 15:22:48.689023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.932 [2024-12-02 15:22:48.689037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.932 #30 NEW cov: 11784 ft: 14911 corp: 29/62b lim: 5 exec/s: 30 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:07:02.932 [2024-12-02 15:22:48.729340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.932 [2024-12-02 15:22:48.729365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.932 [2024-12-02 15:22:48.729425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.932 [2024-12-02 15:22:48.729439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.932 [2024-12-02 15:22:48.729494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.932 [2024-12-02 15:22:48.729507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.932 [2024-12-02 15:22:48.729563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:02.932 [2024-12-02 15:22:48.729576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.191 #31 NEW cov: 11784 ft: 14918 corp: 30/66b lim: 5 exec/s: 31 rss: 70Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:03.191 [2024-12-02 15:22:48.769448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.191 [2024-12-02 15:22:48.769473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.191 [2024-12-02 15:22:48.769531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.191 [2024-12-02 15:22:48.769545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.191 [2024-12-02 15:22:48.769616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.191 [2024-12-02 15:22:48.769630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.191 [2024-12-02 15:22:48.769691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.191 [2024-12-02 15:22:48.769705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.191 #32 NEW cov: 11784 ft: 14953 corp: 31/70b lim: 5 exec/s: 32 rss: 70Mb L: 4/5 MS: 1 ChangeBit- 00:07:03.191 [2024-12-02 15:22:48.809247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.192 [2024-12-02 15:22:48.809272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.192 [2024-12-02 15:22:48.809329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.192 [2024-12-02 15:22:48.809343] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.192 #33 NEW cov: 11784 ft: 15036 corp: 32/72b lim: 5 exec/s: 33 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:07:03.192 [2024-12-02 15:22:48.849236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.192 [2024-12-02 15:22:48.849261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.192 #34 NEW cov: 11784 ft: 15055 corp: 33/73b lim: 5 exec/s: 34 rss: 70Mb L: 1/5 MS: 1 ChangeByte- 00:07:03.192 [2024-12-02 15:22:48.889971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.192 [2024-12-02 15:22:48.889996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.192 [2024-12-02 15:22:48.890052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.192 [2024-12-02 15:22:48.890066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.192 [2024-12-02 15:22:48.890120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.192 [2024-12-02 15:22:48.890133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.192 [2024-12-02 15:22:48.890189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.192 [2024-12-02 15:22:48.890203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.192 [2024-12-02 15:22:48.890257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.192 [2024-12-02 15:22:48.890271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.192 #35 NEW cov: 11784 ft: 15069 corp: 34/78b lim: 5 exec/s: 35 rss: 70Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:03.192 [2024-12-02 15:22:48.929614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.192 [2024-12-02 15:22:48.929639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.192 [2024-12-02 15:22:48.929698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.192 [2024-12-02 15:22:48.929714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.192 #36 NEW cov: 11784 ft: 15092 corp: 35/80b lim: 5 exec/s: 36 
rss: 70Mb L: 2/5 MS: 1 InsertByte- 00:07:03.192 [2024-12-02 15:22:48.969902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.192 [2024-12-02 15:22:48.969927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.192 [2024-12-02 15:22:48.969984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.192 [2024-12-02 15:22:48.969998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.192 [2024-12-02 15:22:48.970053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.192 [2024-12-02 15:22:48.970067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.192 #37 NEW cov: 11784 ft: 15095 corp: 36/83b lim: 5 exec/s: 37 rss: 70Mb L: 3/5 MS: 1 EraseBytes- 00:07:03.192 [2024-12-02 15:22:49.009903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.192 [2024-12-02 15:22:49.009928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.192 [2024-12-02 15:22:49.009986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.192 [2024-12-02 15:22:49.010000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.452 #38 NEW cov: 11784 ft: 15098 corp: 37/85b lim: 5 exec/s: 38 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:07:03.452 [2024-12-02 15:22:49.050333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.050357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.452 [2024-12-02 15:22:49.050413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.050426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.452 [2024-12-02 15:22:49.050480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.050510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.452 [2024-12-02 15:22:49.050567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.050580] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.452 #39 NEW cov: 11784 ft: 15108 corp: 38/89b lim: 5 exec/s: 39 rss: 70Mb L: 4/5 MS: 1 ChangeByte- 00:07:03.452 [2024-12-02 15:22:49.090083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.090108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.452 [2024-12-02 15:22:49.090170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.090184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.452 #40 NEW cov: 11784 ft: 15120 corp: 39/91b lim: 5 exec/s: 40 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:03.452 [2024-12-02 15:22:49.130041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.130066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.452 #41 NEW cov: 11784 ft: 15126 corp: 40/92b lim: 5 exec/s: 41 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:07:03.452 [2024-12-02 15:22:49.170794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.170818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.452 [2024-12-02 15:22:49.170875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.170889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.452 [2024-12-02 15:22:49.170943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.170956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.452 [2024-12-02 15:22:49.171010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.171023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.452 [2024-12-02 15:22:49.171078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.171091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:03.452 #42 NEW cov: 11784 ft: 15164 corp: 41/97b lim: 5 exec/s: 42 rss: 70Mb L: 5/5 MS: 1 
PersAutoDict- DE: "\001\002\000\000"- 00:07:03.452 [2024-12-02 15:22:49.220811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.220837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.452 [2024-12-02 15:22:49.220910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.220935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.452 [2024-12-02 15:22:49.220991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.221004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.452 [2024-12-02 15:22:49.221060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.221076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:03.452 #43 NEW cov: 11784 ft: 15208 corp: 42/101b lim: 5 exec/s: 43 rss: 70Mb L: 4/5 MS: 1 CrossOver- 00:07:03.452 [2024-12-02 15:22:49.260433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.452 [2024-12-02 15:22:49.260457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.712 #44 NEW cov: 11784 ft: 15244 corp: 43/102b lim: 5 exec/s: 44 rss: 70Mb L: 1/5 MS: 1 CrossOver- 00:07:03.712 [2024-12-02 15:22:49.300872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.712 [2024-12-02 15:22:49.300896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.712 [2024-12-02 15:22:49.300955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.712 [2024-12-02 15:22:49.300969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.712 [2024-12-02 15:22:49.301014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.712 [2024-12-02 15:22:49.301045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.712 #45 NEW cov: 11784 ft: 15251 corp: 44/105b lim: 5 exec/s: 22 rss: 70Mb L: 3/5 MS: 1 EraseBytes- 00:07:03.712 #45 DONE cov: 11784 ft: 15251 corp: 44/105b lim: 5 exec/s: 22 rss: 70Mb 00:07:03.712 ###### Recommended dictionary. 
###### 00:07:03.712 "\001\002\000\000" # Uses: 1 00:07:03.712 ###### End of recommended dictionary. ###### 00:07:03.712 Done 45 runs in 2 second(s) 00:07:03.712 15:22:49 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:07:03.712 15:22:49 -- ../common.sh@72 -- # (( i++ )) 00:07:03.712 15:22:49 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:03.712 15:22:49 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:07:03.712 15:22:49 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:07:03.712 15:22:49 -- nvmf/run.sh@24 -- # local timen=1 00:07:03.712 15:22:49 -- nvmf/run.sh@25 -- # local core=0x1 00:07:03.712 15:22:49 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:03.712 15:22:49 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:07:03.712 15:22:49 -- nvmf/run.sh@29 -- # printf %02d 10 00:07:03.712 15:22:49 -- nvmf/run.sh@29 -- # port=4410 00:07:03.712 15:22:49 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:03.712 15:22:49 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:07:03.712 15:22:49 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:03.713 15:22:49 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:07:03.713 [2024-12-02 15:22:49.492287] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:03.713 [2024-12-02 15:22:49.492356] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1599780 ] 00:07:03.713 EAL: No free 2048 kB hugepages reported on node 1 00:07:03.972 [2024-12-02 15:22:49.743116] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.231 [2024-12-02 15:22:49.828510] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:04.231 [2024-12-02 15:22:49.828635] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.231 [2024-12-02 15:22:49.886933] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:04.231 [2024-12-02 15:22:49.903253] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:04.231 INFO: Running with entropic power schedule (0xFF, 100). 00:07:04.231 INFO: Seed: 1487932771 00:07:04.231 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:04.231 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:04.231 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:04.231 INFO: A corpus is not provided, starting from an empty corpus 00:07:04.231 #2 INITED exec/s: 0 rss: 60Mb 00:07:04.231 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:04.231 This may also happen if the target rejected all inputs we tried so far 00:07:04.231 [2024-12-02 15:22:49.974150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.231 [2024-12-02 15:22:49.974188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.231 [2024-12-02 15:22:49.974279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.231 [2024-12-02 15:22:49.974296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.231 [2024-12-02 15:22:49.974373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.231 [2024-12-02 15:22:49.974389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.491 NEW_FUNC[1/670]: 0x447688 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:04.491 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:04.491 #4 NEW cov: 11580 ft: 11581 corp: 2/32b lim: 40 exec/s: 0 rss: 68Mb L: 31/31 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:04.750 [2024-12-02 15:22:50.314311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.750 [2024-12-02 15:22:50.314358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.750 [2024-12-02 15:22:50.314504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.750 [2024-12-02 15:22:50.314524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.750 [2024-12-02 15:22:50.314660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff3dffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.750 [2024-12-02 15:22:50.314682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.750 #5 NEW cov: 11693 ft: 12353 corp: 3/63b lim: 40 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 ChangeByte- 00:07:04.750 [2024-12-02 15:22:50.364306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffefff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.750 [2024-12-02 15:22:50.364336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.750 [2024-12-02 15:22:50.364476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.750 [2024-12-02 15:22:50.364493] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.751 [2024-12-02 15:22:50.364625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.751 [2024-12-02 15:22:50.364645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.751 #6 NEW cov: 11699 ft: 12529 corp: 4/94b lim: 40 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 ChangeBit- 00:07:04.751 [2024-12-02 15:22:50.404394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.751 [2024-12-02 15:22:50.404422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.751 [2024-12-02 15:22:50.404559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.751 [2024-12-02 15:22:50.404577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.751 [2024-12-02 15:22:50.404705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.751 [2024-12-02 15:22:50.404721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.751 #7 NEW cov: 11784 ft: 12761 corp: 5/125b lim: 40 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 ShuffleBytes- 00:07:04.751 [2024-12-02 15:22:50.444542] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.751 [2024-12-02 15:22:50.444570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.751 [2024-12-02 15:22:50.444702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.751 [2024-12-02 15:22:50.444737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.751 [2024-12-02 15:22:50.444863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff3dffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.751 [2024-12-02 15:22:50.444879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.751 #8 NEW cov: 11784 ft: 12832 corp: 6/156b lim: 40 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 ShuffleBytes- 00:07:04.751 [2024-12-02 15:22:50.494272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a112a29 cdw11:51b34b96 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.751 [2024-12-02 15:22:50.494300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.751 #9 NEW cov: 11784 ft: 13276 corp: 7/165b lim: 40 exec/s: 0 rss: 68Mb L: 9/31 MS: 1 
CMP- DE: "\021*)Q\263K\226\000"- 00:07:04.751 [2024-12-02 15:22:50.534971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffefff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.751 [2024-12-02 15:22:50.534999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.751 [2024-12-02 15:22:50.535142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.751 [2024-12-02 15:22:50.535162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.751 [2024-12-02 15:22:50.535283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.751 [2024-12-02 15:22:50.535300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.751 [2024-12-02 15:22:50.535422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffff66 cdw11:66666666 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:04.751 [2024-12-02 15:22:50.535438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.751 #10 NEW cov: 11784 ft: 13800 corp: 8/201b lim: 40 exec/s: 0 rss: 68Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:07:05.011 [2024-12-02 15:22:50.585018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:24ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.585045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.011 [2024-12-02 15:22:50.585182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.585199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.011 [2024-12-02 15:22:50.585327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.585343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.011 #14 NEW cov: 11784 ft: 13841 corp: 9/225b lim: 40 exec/s: 0 rss: 68Mb L: 24/36 MS: 4 InsertByte-EraseBytes-CopyPart-InsertRepeatedBytes- 00:07:05.011 [2024-12-02 15:22:50.625102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.625128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.011 [2024-12-02 15:22:50.625265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 
15:22:50.625281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.011 [2024-12-02 15:22:50.625417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.625433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.011 #15 NEW cov: 11784 ft: 13914 corp: 10/256b lim: 40 exec/s: 0 rss: 68Mb L: 31/36 MS: 1 ShuffleBytes- 00:07:05.011 [2024-12-02 15:22:50.665237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.665263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.011 [2024-12-02 15:22:50.665425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffffa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.665442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.011 [2024-12-02 15:22:50.665577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff3dffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.665597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.011 #16 NEW cov: 11784 ft: 13952 corp: 11/287b lim: 40 exec/s: 0 rss: 68Mb L: 31/36 MS: 1 ChangeBinInt- 00:07:05.011 [2024-12-02 15:22:50.705175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:fffffaff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.705202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.011 [2024-12-02 15:22:50.705327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:3dffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.705344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.011 #17 NEW cov: 11784 ft: 14168 corp: 12/309b lim: 40 exec/s: 0 rss: 69Mb L: 22/36 MS: 1 EraseBytes- 00:07:05.011 [2024-12-02 15:22:50.745013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a112a51 cdw11:51b34b96 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.745041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.011 #18 NEW cov: 11784 ft: 14260 corp: 13/318b lim: 40 exec/s: 0 rss: 69Mb L: 9/36 MS: 1 CopyPart- 00:07:05.011 [2024-12-02 15:22:50.785777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:858585ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.785805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.011 [2024-12-02 15:22:50.785935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.785951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.011 [2024-12-02 15:22:50.786081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.786097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.011 [2024-12-02 15:22:50.786227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:3dffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.786245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.011 #19 NEW cov: 11784 ft: 14331 corp: 14/352b lim: 40 exec/s: 0 rss: 69Mb L: 34/36 MS: 1 InsertRepeatedBytes- 00:07:05.011 [2024-12-02 15:22:50.825972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:858585ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.825999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.011 [2024-12-02 15:22:50.826125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.826142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.011 [2024-12-02 15:22:50.826274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffff3d cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.826291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.011 [2024-12-02 15:22:50.826431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.011 [2024-12-02 15:22:50.826447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.271 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:05.271 #20 NEW cov: 11807 ft: 14383 corp: 15/386b lim: 40 exec/s: 0 rss: 69Mb L: 34/36 MS: 1 CopyPart- 00:07:05.271 [2024-12-02 15:22:50.876118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffefff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.271 [2024-12-02 15:22:50.876146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.271 [2024-12-02 15:22:50.876285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 
cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.271 [2024-12-02 15:22:50.876303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.271 [2024-12-02 15:22:50.876439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.272 [2024-12-02 15:22:50.876455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.272 [2024-12-02 15:22:50.876584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:66666666 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.272 [2024-12-02 15:22:50.876601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.272 #21 NEW cov: 11807 ft: 14401 corp: 16/423b lim: 40 exec/s: 0 rss: 69Mb L: 37/37 MS: 1 CopyPart- 00:07:05.272 [2024-12-02 15:22:50.926082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.272 [2024-12-02 15:22:50.926110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.272 [2024-12-02 15:22:50.926250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffffa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.272 [2024-12-02 15:22:50.926267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.272 [2024-12-02 15:22:50.926406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffff4bff cdw11:ff3dffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.272 [2024-12-02 15:22:50.926421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.272 #22 NEW cov: 11807 ft: 14465 corp: 17/454b lim: 40 exec/s: 22 rss: 69Mb L: 31/37 MS: 1 ChangeByte- 00:07:05.272 [2024-12-02 15:22:50.965926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.272 [2024-12-02 15:22:50.965952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.272 [2024-12-02 15:22:50.966080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.272 [2024-12-02 15:22:50.966097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.272 #23 NEW cov: 11807 ft: 14489 corp: 18/470b lim: 40 exec/s: 23 rss: 69Mb L: 16/37 MS: 1 EraseBytes- 00:07:05.272 [2024-12-02 15:22:51.006051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:fffffffa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.272 [2024-12-02 15:22:51.006077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:05.272 [2024-12-02 15:22:51.006203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffff4bff cdw11:ff3dffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.272 [2024-12-02 15:22:51.006220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.272 #24 NEW cov: 11807 ft: 14506 corp: 19/493b lim: 40 exec/s: 24 rss: 69Mb L: 23/37 MS: 1 EraseBytes- 00:07:05.272 [2024-12-02 15:22:51.046582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.272 [2024-12-02 15:22:51.046608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.272 [2024-12-02 15:22:51.046738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffffa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.272 [2024-12-02 15:22:51.046754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.272 [2024-12-02 15:22:51.046891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff3dffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.272 [2024-12-02 15:22:51.046907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.272 [2024-12-02 15:22:51.047030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.272 [2024-12-02 15:22:51.047046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.272 #25 NEW cov: 11807 ft: 14528 corp: 20/531b lim: 40 exec/s: 25 rss: 69Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:07:05.272 [2024-12-02 15:22:51.086096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a112a51 cdw11:f6ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.272 [2024-12-02 15:22:51.086123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.532 #26 NEW cov: 11807 ft: 14541 corp: 21/540b lim: 40 exec/s: 26 rss: 69Mb L: 9/38 MS: 1 CMP- DE: "\366\377\377\377"- 00:07:05.532 [2024-12-02 15:22:51.126336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:fffaffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.532 [2024-12-02 15:22:51.126363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.532 [2024-12-02 15:22:51.126499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.532 [2024-12-02 15:22:51.126517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.532 #27 NEW cov: 11807 ft: 14556 corp: 22/556b lim: 40 exec/s: 27 rss: 69Mb L: 16/38 MS: 1 ChangeBinInt- 00:07:05.532 [2024-12-02 15:22:51.166848] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffefff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.532 [2024-12-02 15:22:51.166875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.532 [2024-12-02 15:22:51.167016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffefff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.532 [2024-12-02 15:22:51.167035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.532 [2024-12-02 15:22:51.167177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.532 [2024-12-02 15:22:51.167193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.532 [2024-12-02 15:22:51.167319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.532 [2024-12-02 15:22:51.167335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.532 #28 NEW cov: 11807 ft: 14573 corp: 23/593b lim: 40 exec/s: 28 rss: 69Mb L: 37/38 MS: 1 CopyPart- 00:07:05.532 [2024-12-02 15:22:51.206590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:fffffffa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.532 [2024-12-02 15:22:51.206615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.532 [2024-12-02 15:22:51.206738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffff4bff cdw11:ff3dfffb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.532 [2024-12-02 15:22:51.206754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.532 #29 NEW cov: 11807 ft: 14591 corp: 24/616b lim: 40 exec/s: 29 rss: 69Mb L: 23/38 MS: 1 ChangeBinInt- 00:07:05.532 [2024-12-02 15:22:51.247164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:858585ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.532 [2024-12-02 15:22:51.247191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.532 [2024-12-02 15:22:51.247319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.532 [2024-12-02 15:22:51.247335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.532 [2024-12-02 15:22:51.247461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.532 [2024-12-02 15:22:51.247477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 
p:0 m:0 dnr:0 00:07:05.532 [2024-12-02 15:22:51.247613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:3dffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.532 [2024-12-02 15:22:51.247629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.532 #30 NEW cov: 11807 ft: 14611 corp: 25/650b lim: 40 exec/s: 30 rss: 70Mb L: 34/38 MS: 1 ShuffleBytes- 00:07:05.532 [2024-12-02 15:22:51.287437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:858585ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.532 [2024-12-02 15:22:51.287465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.532 [2024-12-02 15:22:51.287608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.533 [2024-12-02 15:22:51.287625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.533 [2024-12-02 15:22:51.287726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffff3d cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.533 [2024-12-02 15:22:51.287752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.533 [2024-12-02 15:22:51.287883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.533 [2024-12-02 15:22:51.287901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.533 #31 NEW cov: 11807 ft: 14643 corp: 26/684b lim: 40 exec/s: 31 rss: 70Mb L: 34/38 MS: 1 CopyPart- 00:07:05.533 [2024-12-02 15:22:51.337517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.533 [2024-12-02 15:22:51.337543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.533 [2024-12-02 15:22:51.337678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffffa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.533 [2024-12-02 15:22:51.337695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.533 [2024-12-02 15:22:51.337828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff3dffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.533 [2024-12-02 15:22:51.337845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.533 [2024-12-02 15:22:51.337976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.533 [2024-12-02 15:22:51.337991] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.793 #32 NEW cov: 11807 ft: 14646 corp: 27/717b lim: 40 exec/s: 32 rss: 70Mb L: 33/38 MS: 1 CrossOver- 00:07:05.793 [2024-12-02 15:22:51.377546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffefff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.377573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.793 [2024-12-02 15:22:51.377713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2bffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.377730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.793 [2024-12-02 15:22:51.377866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.377883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.793 [2024-12-02 15:22:51.378011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.378030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.793 #33 NEW cov: 11807 ft: 14653 corp: 28/749b lim: 40 exec/s: 33 rss: 70Mb L: 32/38 MS: 1 InsertByte- 00:07:05.793 [2024-12-02 15:22:51.417691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffefff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.417718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.793 [2024-12-02 15:22:51.417856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.417877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.793 [2024-12-02 15:22:51.418016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.418032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.793 [2024-12-02 15:22:51.418166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.418183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.793 #34 NEW cov: 11807 ft: 14680 corp: 29/781b lim: 40 exec/s: 34 rss: 70Mb L: 32/38 MS: 1 CrossOver- 00:07:05.793 [2024-12-02 15:22:51.457619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 
nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.457646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.793 [2024-12-02 15:22:51.457785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffffffa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.457802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.793 [2024-12-02 15:22:51.457941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffbdffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.457958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.793 #35 NEW cov: 11807 ft: 14712 corp: 30/812b lim: 40 exec/s: 35 rss: 70Mb L: 31/38 MS: 1 ChangeBit- 00:07:05.793 [2024-12-02 15:22:51.497678] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:7dffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.497705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.793 [2024-12-02 15:22:51.497847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:faffff4b cdw11:ffff3dff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.497863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.793 [2024-12-02 15:22:51.497992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:fbffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.498008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.793 #36 NEW cov: 11807 ft: 14713 corp: 31/836b lim: 40 exec/s: 36 rss: 70Mb L: 24/38 MS: 1 InsertByte- 00:07:05.793 [2024-12-02 15:22:51.537641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffefff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.537676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.793 [2024-12-02 15:22:51.537812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffefff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.537831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.793 #37 NEW cov: 11807 ft: 14723 corp: 32/856b lim: 40 exec/s: 37 rss: 70Mb L: 20/38 MS: 1 EraseBytes- 00:07:05.793 [2024-12-02 15:22:51.587730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.587764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.793 [2024-12-02 15:22:51.587906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:05.793 [2024-12-02 15:22:51.587923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.793 #38 NEW cov: 11807 ft: 14758 corp: 33/872b lim: 40 exec/s: 38 rss: 70Mb L: 16/38 MS: 1 ShuffleBytes- 00:07:06.053 [2024-12-02 15:22:51.628279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffefff cdw11:ffff0700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.628309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.053 [2024-12-02 15:22:51.628440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2cffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.628457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.053 [2024-12-02 15:22:51.628591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.628607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.053 [2024-12-02 15:22:51.628738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.628755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.053 #39 NEW cov: 11807 ft: 14768 corp: 34/904b lim: 40 exec/s: 39 rss: 70Mb L: 32/38 MS: 1 ChangeBinInt- 00:07:06.053 [2024-12-02 15:22:51.668002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:858585ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.668030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.053 [2024-12-02 15:22:51.668156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.668173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.053 #40 NEW cov: 11807 ft: 14783 corp: 35/924b lim: 40 exec/s: 40 rss: 70Mb L: 20/38 MS: 1 EraseBytes- 00:07:06.053 [2024-12-02 15:22:51.717907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a112a51 cdw11:f2ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.717937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.053 #41 NEW cov: 11807 ft: 14792 corp: 36/933b lim: 40 exec/s: 41 rss: 70Mb L: 9/38 MS: 1 ChangeBit- 00:07:06.053 [2024-12-02 15:22:51.768334] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.768361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.053 [2024-12-02 15:22:51.768488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:7dffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.768506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.053 #42 NEW cov: 11807 ft: 14810 corp: 37/950b lim: 40 exec/s: 42 rss: 70Mb L: 17/38 MS: 1 CrossOver- 00:07:06.053 [2024-12-02 15:22:51.808699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffefff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.808725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.053 [2024-12-02 15:22:51.808863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:2bffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.808881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.053 [2024-12-02 15:22:51.809011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.809027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.053 [2024-12-02 15:22:51.809160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fffffffb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.809176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.053 [2024-12-02 15:22:51.848893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffefff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.848920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.053 [2024-12-02 15:22:51.849047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.849063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.053 [2024-12-02 15:22:51.849192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.849208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.053 [2024-12-02 15:22:51.849339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY 
RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:fffffffb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.053 [2024-12-02 15:22:51.849355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.053 #44 NEW cov: 11807 ft: 14849 corp: 38/983b lim: 40 exec/s: 44 rss: 70Mb L: 33/38 MS: 2 InsertByte-CopyPart- 00:07:06.313 [2024-12-02 15:22:51.888863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.313 [2024-12-02 15:22:51.888890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.313 [2024-12-02 15:22:51.889025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fffeffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.313 [2024-12-02 15:22:51.889042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.313 [2024-12-02 15:22:51.889169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.313 [2024-12-02 15:22:51.889187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.313 #45 NEW cov: 11807 ft: 14857 corp: 39/1014b lim: 40 exec/s: 45 rss: 70Mb L: 31/38 MS: 1 ChangeBit- 00:07:06.313 [2024-12-02 15:22:51.929337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.313 [2024-12-02 15:22:51.929363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.313 [2024-12-02 15:22:51.929489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.313 [2024-12-02 15:22:51.929505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.313 [2024-12-02 15:22:51.929638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff3dffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.314 [2024-12-02 15:22:51.929654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.314 [2024-12-02 15:22:51.929781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.314 [2024-12-02 15:22:51.929796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.314 [2024-12-02 15:22:51.929920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:06.314 [2024-12-02 15:22:51.929935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:06.314 #46 NEW cov: 11807 ft: 14901 corp: 
40/1054b lim: 40 exec/s: 23 rss: 70Mb L: 40/40 MS: 1 CopyPart- 00:07:06.314 #46 DONE cov: 11807 ft: 14901 corp: 40/1054b lim: 40 exec/s: 23 rss: 70Mb 00:07:06.314 ###### Recommended dictionary. ###### 00:07:06.314 "\021*)Q\263K\226\000" # Uses: 0 00:07:06.314 "\366\377\377\377" # Uses: 0 00:07:06.314 ###### End of recommended dictionary. ###### 00:07:06.314 Done 46 runs in 2 second(s) 00:07:06.314 15:22:52 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:07:06.314 15:22:52 -- ../common.sh@72 -- # (( i++ )) 00:07:06.314 15:22:52 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:06.314 15:22:52 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:07:06.314 15:22:52 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:07:06.314 15:22:52 -- nvmf/run.sh@24 -- # local timen=1 00:07:06.314 15:22:52 -- nvmf/run.sh@25 -- # local core=0x1 00:07:06.314 15:22:52 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:06.314 15:22:52 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:07:06.314 15:22:52 -- nvmf/run.sh@29 -- # printf %02d 11 00:07:06.314 15:22:52 -- nvmf/run.sh@29 -- # port=4411 00:07:06.314 15:22:52 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:06.314 15:22:52 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:07:06.314 15:22:52 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:06.314 15:22:52 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:07:06.314 [2024-12-02 15:22:52.113592] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:06.314 [2024-12-02 15:22:52.113655] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1600317 ] 00:07:06.573 EAL: No free 2048 kB hugepages reported on node 1 00:07:06.573 [2024-12-02 15:22:52.359297] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.832 [2024-12-02 15:22:52.446494] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:06.832 [2024-12-02 15:22:52.446612] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.832 [2024-12-02 15:22:52.504253] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:06.832 [2024-12-02 15:22:52.520573] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:07:06.832 INFO: Running with entropic power schedule (0xFF, 100). 
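The `+`-traced run.sh commands above amount to one parameterized launch per fuzzer index. The sketch below is a reconstruction from the trace, not the script itself: $SPDK_DIR and $OUTPUT_DIR abbreviate the long Jenkins workspace paths, the "44 + printf %02d" port derivation is inferred from fuzzer 11 -> 4411 (and 12 -> 4412 further down), and the redirect of sed's output into the per-run config is an assumption — the trace shows the sed command and the -c flag that consumes the file, not the redirect itself.

# Reconstruction (not verbatim) of the start_llvm_fuzz pattern traced above.
start_llvm_fuzz() {
  local fuzzer_type=$1 timen=$2 core=$3
  local nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
  local corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_${fuzzer_type}"
  local port="44$(printf %02d "$fuzzer_type")"   # inferred: 11 -> 4411, 12 -> 4412
  local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"
  mkdir -p "$corpus_dir"
  # Assumed redirect: rebase the template config from the default port 4420 onto this run's port.
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
      "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
  "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
      -m "$core" -s 512 -P "$OUTPUT_DIR/llvm/" -F "$trid" -c "$nvmf_cfg" \
      -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"
}

Run 12 later in this log repeats the same sequence with fuzzer_type=12 (port 4412, /tmp/fuzz_json_12.conf, llvm_nvmf_12).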
00:07:06.832 INFO: Seed: 4106930739 00:07:06.832 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:06.832 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:06.832 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:06.832 INFO: A corpus is not provided, starting from an empty corpus 00:07:06.832 #2 INITED exec/s: 0 rss: 60Mb 00:07:06.832 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:06.832 This may also happen if the target rejected all inputs we tried so far 00:07:06.832 [2024-12-02 15:22:52.576138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.832 [2024-12-02 15:22:52.576168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.832 [2024-12-02 15:22:52.576232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.832 [2024-12-02 15:22:52.576247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.832 [2024-12-02 15:22:52.576308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.832 [2024-12-02 15:22:52.576323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.092 NEW_FUNC[1/671]: 0x4493f8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:07:07.092 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:07.092 #3 NEW cov: 11592 ft: 11592 corp: 2/26b lim: 40 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:07:07.092 [2024-12-02 15:22:52.896723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fc2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.092 [2024-12-02 15:22:52.896790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.351 #11 NEW cov: 11705 ft: 13035 corp: 3/35b lim: 40 exec/s: 0 rss: 68Mb L: 9/25 MS: 3 ChangeBinInt-CrossOver-CrossOver- 00:07:07.352 [2024-12-02 15:22:52.946942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.352 [2024-12-02 15:22:52.946968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.352 [2024-12-02 15:22:52.947025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.352 [2024-12-02 15:22:52.947038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.352 [2024-12-02 15:22:52.947097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND 
(81) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.352 [2024-12-02 15:22:52.947114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.352 #12 NEW cov: 11711 ft: 13197 corp: 4/60b lim: 40 exec/s: 0 rss: 69Mb L: 25/25 MS: 1 ShuffleBytes- 00:07:07.352 [2024-12-02 15:22:52.987031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.352 [2024-12-02 15:22:52.987058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.352 [2024-12-02 15:22:52.987117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2b2b00 cdw11:0000192b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.352 [2024-12-02 15:22:52.987130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.352 [2024-12-02 15:22:52.987187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.352 [2024-12-02 15:22:52.987200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.352 #13 NEW cov: 11796 ft: 13535 corp: 5/85b lim: 40 exec/s: 0 rss: 69Mb L: 25/25 MS: 1 ChangeBinInt- 00:07:07.352 [2024-12-02 15:22:53.027167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.352 [2024-12-02 15:22:53.027191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.352 [2024-12-02 15:22:53.027266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2b2b00 cdw11:0000192b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.352 [2024-12-02 15:22:53.027279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.352 [2024-12-02 15:22:53.027334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2b000000 cdw11:192b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.352 [2024-12-02 15:22:53.027346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.352 #19 NEW cov: 11796 ft: 13599 corp: 6/110b lim: 40 exec/s: 0 rss: 69Mb L: 25/25 MS: 1 CopyPart- 00:07:07.352 [2024-12-02 15:22:53.067275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.352 [2024-12-02 15:22:53.067301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.352 [2024-12-02 15:22:53.067357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.352 [2024-12-02 15:22:53.067370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:07.352 [2024-12-02 15:22:53.067426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:032b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.352 [2024-12-02 15:22:53.067439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.352 #20 NEW cov: 11796 ft: 13639 corp: 7/135b lim: 40 exec/s: 0 rss: 69Mb L: 25/25 MS: 1 ChangeByte- 00:07:07.352 [2024-12-02 15:22:53.107032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fc2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.352 [2024-12-02 15:22:53.107057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.352 #21 NEW cov: 11796 ft: 13785 corp: 8/144b lim: 40 exec/s: 0 rss: 69Mb L: 9/25 MS: 1 ChangeBit- 00:07:07.352 [2024-12-02 15:22:53.147285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:2b000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.352 [2024-12-02 15:22:53.147310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.352 [2024-12-02 15:22:53.147368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:192b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.352 [2024-12-02 15:22:53.147381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.352 #22 NEW cov: 11796 ft: 14052 corp: 9/163b lim: 40 exec/s: 0 rss: 69Mb L: 19/25 MS: 1 EraseBytes- 00:07:07.611 [2024-12-02 15:22:53.187537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:ab2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.611 [2024-12-02 15:22:53.187562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.611 [2024-12-02 15:22:53.187619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.611 [2024-12-02 15:22:53.187633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.611 [2024-12-02 15:22:53.187690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.611 [2024-12-02 15:22:53.187704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.611 #23 NEW cov: 11796 ft: 14085 corp: 10/188b lim: 40 exec/s: 0 rss: 69Mb L: 25/25 MS: 1 ChangeBit- 00:07:07.611 [2024-12-02 15:22:53.227532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:2b000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.611 [2024-12-02 15:22:53.227557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.611 [2024-12-02 15:22:53.227614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 
cdw10:192b2b2b cdw11:2b2b2b3b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.611 [2024-12-02 15:22:53.227628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.611 #24 NEW cov: 11796 ft: 14135 corp: 11/207b lim: 40 exec/s: 0 rss: 69Mb L: 19/25 MS: 1 ChangeBit- 00:07:07.611 [2024-12-02 15:22:53.267658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2bcd2b2b cdw11:2b000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.611 [2024-12-02 15:22:53.267683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.611 [2024-12-02 15:22:53.267744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:192b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.611 [2024-12-02 15:22:53.267758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.611 #25 NEW cov: 11796 ft: 14157 corp: 12/226b lim: 40 exec/s: 0 rss: 69Mb L: 19/25 MS: 1 ChangeByte- 00:07:07.611 [2024-12-02 15:22:53.307631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fc2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.611 [2024-12-02 15:22:53.307655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.611 #26 NEW cov: 11796 ft: 14219 corp: 13/235b lim: 40 exec/s: 0 rss: 69Mb L: 9/25 MS: 1 ShuffleBytes- 00:07:07.611 [2024-12-02 15:22:53.347893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fc2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.611 [2024-12-02 15:22:53.347919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.611 [2024-12-02 15:22:53.347977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.611 [2024-12-02 15:22:53.347990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.611 #27 NEW cov: 11796 ft: 14235 corp: 14/254b lim: 40 exec/s: 0 rss: 69Mb L: 19/25 MS: 1 CrossOver- 00:07:07.611 [2024-12-02 15:22:53.387887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.611 [2024-12-02 15:22:53.387911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.611 #28 NEW cov: 11796 ft: 14257 corp: 15/263b lim: 40 exec/s: 0 rss: 69Mb L: 9/25 MS: 1 CopyPart- 00:07:07.611 [2024-12-02 15:22:53.428164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:26000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.611 [2024-12-02 15:22:53.428189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.611 [2024-12-02 15:22:53.428248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:192b2b2b cdw11:2b2b2b3b SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.611 [2024-12-02 15:22:53.428262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.871 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:07.871 #29 NEW cov: 11819 ft: 14298 corp: 16/282b lim: 40 exec/s: 0 rss: 69Mb L: 19/25 MS: 1 ChangeBinInt- 00:07:07.871 [2024-12-02 15:22:53.468095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:2b762b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.871 [2024-12-02 15:22:53.468119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.871 #30 NEW cov: 11819 ft: 14348 corp: 17/291b lim: 40 exec/s: 0 rss: 70Mb L: 9/25 MS: 1 ChangeByte- 00:07:07.871 [2024-12-02 15:22:53.508199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:2b762b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.872 [2024-12-02 15:22:53.508223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.872 #31 NEW cov: 11819 ft: 14389 corp: 18/300b lim: 40 exec/s: 0 rss: 70Mb L: 9/25 MS: 1 ShuffleBytes- 00:07:07.872 [2024-12-02 15:22:53.548307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.872 [2024-12-02 15:22:53.548332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.872 #32 NEW cov: 11819 ft: 14399 corp: 19/313b lim: 40 exec/s: 32 rss: 70Mb L: 13/25 MS: 1 EraseBytes- 00:07:07.872 [2024-12-02 15:22:53.588576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fc2b2b2b cdw11:2b2b332b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.872 [2024-12-02 15:22:53.588601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.872 [2024-12-02 15:22:53.588661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.872 [2024-12-02 15:22:53.588674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.872 #33 NEW cov: 11819 ft: 14412 corp: 20/333b lim: 40 exec/s: 33 rss: 70Mb L: 20/25 MS: 1 InsertByte- 00:07:07.872 [2024-12-02 15:22:53.628868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:ab2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.872 [2024-12-02 15:22:53.628892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.872 [2024-12-02 15:22:53.628950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.872 [2024-12-02 15:22:53.628964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.872 [2024-12-02 15:22:53.629022] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2b2b2b3b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.872 [2024-12-02 15:22:53.629035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.872 #34 NEW cov: 11819 ft: 14458 corp: 21/358b lim: 40 exec/s: 34 rss: 70Mb L: 25/25 MS: 1 ChangeBit- 00:07:07.872 [2024-12-02 15:22:53.668659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fc2b282b cdw11:282b4d49 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.872 [2024-12-02 15:22:53.668684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.131 #39 NEW cov: 11819 ft: 14553 corp: 22/373b lim: 40 exec/s: 39 rss: 70Mb L: 15/25 MS: 5 CrossOver-CopyPart-ChangeByte-CopyPart-CMP- DE: "MI\036\002\000\000\000\000"- 00:07:08.131 [2024-12-02 15:22:53.709227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fc2b2b2b cdw11:2b2bffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.131 [2024-12-02 15:22:53.709252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.131 [2024-12-02 15:22:53.709308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.131 [2024-12-02 15:22:53.709321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.131 [2024-12-02 15:22:53.709391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.131 [2024-12-02 15:22:53.709405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.131 [2024-12-02 15:22:53.709462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.131 [2024-12-02 15:22:53.709475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.131 #40 NEW cov: 11819 ft: 14897 corp: 23/406b lim: 40 exec/s: 40 rss: 70Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:07:08.131 [2024-12-02 15:22:53.748878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fc2b2b2b cdw11:232b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.131 [2024-12-02 15:22:53.748903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.131 #41 NEW cov: 11819 ft: 14903 corp: 24/415b lim: 40 exec/s: 41 rss: 70Mb L: 9/33 MS: 1 ChangeBit- 00:07:08.131 [2024-12-02 15:22:53.789000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a4d491e cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.131 [2024-12-02 15:22:53.789025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.131 #42 NEW cov: 11819 ft: 14916 corp: 25/424b lim: 40 exec/s: 42 rss: 70Mb L: 9/33 MS: 1 PersAutoDict- DE: 
"MI\036\002\000\000\000\000"- 00:07:08.131 [2024-12-02 15:22:53.829577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:ab2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.131 [2024-12-02 15:22:53.829603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.131 [2024-12-02 15:22:53.829662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2bab SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.131 [2024-12-02 15:22:53.829675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.131 [2024-12-02 15:22:53.829731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.131 [2024-12-02 15:22:53.829750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.131 [2024-12-02 15:22:53.829807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b3b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.131 [2024-12-02 15:22:53.829820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.131 #43 NEW cov: 11819 ft: 14927 corp: 26/459b lim: 40 exec/s: 43 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:07:08.131 [2024-12-02 15:22:53.869241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fc4d491e cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.131 [2024-12-02 15:22:53.869267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.131 #46 NEW cov: 11819 ft: 14989 corp: 27/473b lim: 40 exec/s: 46 rss: 70Mb L: 14/35 MS: 3 EraseBytes-EraseBytes-PersAutoDict- DE: "MI\036\002\000\000\000\000"- 00:07:08.131 [2024-12-02 15:22:53.909525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fc4d491e cdw11:4d491e02 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.131 [2024-12-02 15:22:53.909550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.131 [2024-12-02 15:22:53.909608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:2b020000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.131 [2024-12-02 15:22:53.909621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.131 #47 NEW cov: 11819 ft: 15002 corp: 28/496b lim: 40 exec/s: 47 rss: 70Mb L: 23/35 MS: 1 CopyPart- 00:07:08.131 [2024-12-02 15:22:53.949684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fc2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.131 [2024-12-02 15:22:53.949710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.131 [2024-12-02 15:22:53.949771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2f2b2b 
cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.131 [2024-12-02 15:22:53.949785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.391 #48 NEW cov: 11819 ft: 15025 corp: 29/513b lim: 40 exec/s: 48 rss: 70Mb L: 17/35 MS: 1 CopyPart- 00:07:08.391 [2024-12-02 15:22:53.989915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:ab2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.391 [2024-12-02 15:22:53.989941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.391 [2024-12-02 15:22:53.990000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.391 [2024-12-02 15:22:53.990016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.391 [2024-12-02 15:22:53.990089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b00002b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.391 [2024-12-02 15:22:53.990102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.391 #49 NEW cov: 11819 ft: 15035 corp: 30/538b lim: 40 exec/s: 49 rss: 70Mb L: 25/35 MS: 1 CrossOver- 00:07:08.391 [2024-12-02 15:22:54.030176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:002b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.391 [2024-12-02 15:22:54.030200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.391 [2024-12-02 15:22:54.030257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2bab SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.391 [2024-12-02 15:22:54.030270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.391 [2024-12-02 15:22:54.030327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.391 [2024-12-02 15:22:54.030340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.391 [2024-12-02 15:22:54.030393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:2b2b2b2b cdw11:2b3b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.391 [2024-12-02 15:22:54.030406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.391 #50 NEW cov: 11819 ft: 15049 corp: 31/573b lim: 40 exec/s: 50 rss: 70Mb L: 35/35 MS: 1 ChangeByte- 00:07:08.391 [2024-12-02 15:22:54.069836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fc2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.391 [2024-12-02 15:22:54.069861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.391 #51 NEW cov: 11819 ft: 
15057 corp: 32/582b lim: 40 exec/s: 51 rss: 70Mb L: 9/35 MS: 1 ShuffleBytes- 00:07:08.391 [2024-12-02 15:22:54.110404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2bfc cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.391 [2024-12-02 15:22:54.110430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.391 [2024-12-02 15:22:54.110487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.391 [2024-12-02 15:22:54.110500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.391 [2024-12-02 15:22:54.110557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.391 [2024-12-02 15:22:54.110570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.391 [2024-12-02 15:22:54.110625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:2b2b2b03 cdw11:2b2b2b0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.391 [2024-12-02 15:22:54.110638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.391 #52 NEW cov: 11819 ft: 15067 corp: 33/615b lim: 40 exec/s: 52 rss: 70Mb L: 33/35 MS: 1 CrossOver- 00:07:08.391 [2024-12-02 15:22:54.150064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a4d491e cdw11:02000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.391 [2024-12-02 15:22:54.150088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.391 #53 NEW cov: 11819 ft: 15077 corp: 34/624b lim: 40 exec/s: 53 rss: 70Mb L: 9/35 MS: 1 ChangeBinInt- 00:07:08.391 [2024-12-02 15:22:54.190499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.391 [2024-12-02 15:22:54.190523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.391 [2024-12-02 15:22:54.190582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b322b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.391 [2024-12-02 15:22:54.190595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.391 [2024-12-02 15:22:54.190653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b032b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.391 [2024-12-02 15:22:54.190666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.651 #54 NEW cov: 11819 ft: 15087 corp: 35/650b lim: 40 exec/s: 54 rss: 70Mb L: 26/35 MS: 1 InsertByte- 00:07:08.651 [2024-12-02 15:22:54.230463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fc4d491e cdw11:4d491e02 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.651 [2024-12-02 15:22:54.230487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.651 [2024-12-02 15:22:54.230544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:2b02004d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.651 [2024-12-02 15:22:54.230558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.651 #55 NEW cov: 11819 ft: 15103 corp: 36/673b lim: 40 exec/s: 55 rss: 70Mb L: 23/35 MS: 1 PersAutoDict- DE: "MI\036\002\000\000\000\000"- 00:07:08.651 [2024-12-02 15:22:54.270601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:ab2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.651 [2024-12-02 15:22:54.270625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.651 [2024-12-02 15:22:54.270684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2bab SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.651 [2024-12-02 15:22:54.270697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.651 #56 NEW cov: 11819 ft: 15145 corp: 37/693b lim: 40 exec/s: 56 rss: 70Mb L: 20/35 MS: 1 EraseBytes- 00:07:08.651 [2024-12-02 15:22:54.310998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:56565656 cdw11:56565656 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.651 [2024-12-02 15:22:54.311023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.651 [2024-12-02 15:22:54.311080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:56565656 cdw11:56565656 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.651 [2024-12-02 15:22:54.311094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.651 [2024-12-02 15:22:54.311154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:56565656 cdw11:56565656 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.651 [2024-12-02 15:22:54.311170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.651 [2024-12-02 15:22:54.311228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:56565656 cdw11:56565656 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.651 [2024-12-02 15:22:54.311241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.651 #57 NEW cov: 11819 ft: 15151 corp: 38/730b lim: 40 exec/s: 57 rss: 70Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:07:08.651 [2024-12-02 15:22:54.351127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2bffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.651 [2024-12-02 15:22:54.351153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:08.651 [2024-12-02 15:22:54.351210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.651 [2024-12-02 15:22:54.351224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.651 [2024-12-02 15:22:54.351282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2b2b2b2b cdw11:2b000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.651 [2024-12-02 15:22:54.351295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.651 [2024-12-02 15:22:54.351352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:192b2b00 cdw11:0000192b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.651 [2024-12-02 15:22:54.351365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.651 #58 NEW cov: 11819 ft: 15175 corp: 39/765b lim: 40 exec/s: 58 rss: 70Mb L: 35/37 MS: 1 InsertRepeatedBytes- 00:07:08.651 [2024-12-02 15:22:54.390807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:6a762b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.651 [2024-12-02 15:22:54.390833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.651 #59 NEW cov: 11819 ft: 15186 corp: 40/774b lim: 40 exec/s: 59 rss: 70Mb L: 9/37 MS: 1 ChangeByte- 00:07:08.651 [2024-12-02 15:22:54.430887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:fc2b2b23 cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.651 [2024-12-02 15:22:54.430913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.651 #60 NEW cov: 11819 ft: 15216 corp: 41/783b lim: 40 exec/s: 60 rss: 70Mb L: 9/37 MS: 1 ChangeByte- 00:07:08.912 [2024-12-02 15:22:54.471012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.912 [2024-12-02 15:22:54.471037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.912 #61 NEW cov: 11819 ft: 15303 corp: 42/792b lim: 40 exec/s: 61 rss: 70Mb L: 9/37 MS: 1 ShuffleBytes- 00:07:08.912 [2024-12-02 15:22:54.511384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2b2b2b2b cdw11:ab2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.912 [2024-12-02 15:22:54.511409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.912 [2024-12-02 15:22:54.511466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2b2b2b2b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.912 [2024-12-02 15:22:54.511479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.912 [2024-12-02 15:22:54.511535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 
cid:6 nsid:0 cdw10:2b2b2b3b cdw11:2b2b2b2b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.912 [2024-12-02 15:22:54.511549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.912 #62 NEW cov: 11819 ft: 15318 corp: 43/817b lim: 40 exec/s: 62 rss: 70Mb L: 25/37 MS: 1 ChangeByte- 00:07:08.912 [2024-12-02 15:22:54.551217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a4d491e cdw11:0200f802 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:08.912 [2024-12-02 15:22:54.551241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.912 #63 NEW cov: 11819 ft: 15335 corp: 44/826b lim: 40 exec/s: 31 rss: 70Mb L: 9/37 MS: 1 ChangeBinInt- 00:07:08.912 #63 DONE cov: 11819 ft: 15335 corp: 44/826b lim: 40 exec/s: 31 rss: 70Mb 00:07:08.912 ###### Recommended dictionary. ###### 00:07:08.912 "MI\036\002\000\000\000\000" # Uses: 3 00:07:08.912 ###### End of recommended dictionary. ###### 00:07:08.912 Done 63 runs in 2 second(s) 00:07:08.912 15:22:54 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:07:08.912 15:22:54 -- ../common.sh@72 -- # (( i++ )) 00:07:08.912 15:22:54 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:08.912 15:22:54 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:07:08.912 15:22:54 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:07:08.912 15:22:54 -- nvmf/run.sh@24 -- # local timen=1 00:07:08.912 15:22:54 -- nvmf/run.sh@25 -- # local core=0x1 00:07:08.912 15:22:54 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:08.912 15:22:54 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:07:08.912 15:22:54 -- nvmf/run.sh@29 -- # printf %02d 12 00:07:08.912 15:22:54 -- nvmf/run.sh@29 -- # port=4412 00:07:08.912 15:22:54 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:08.912 15:22:54 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:07:08.912 15:22:54 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:08.912 15:22:54 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:07:09.171 [2024-12-02 15:22:54.743997] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
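The "Recommended dictionary" block printed at the end of each run (e.g. "MI\036\002\000\000\000\000" above, i.e. the 4d491e02/00000000 byte pattern visible in the PersAutoDict mutations) is standard libFuzzer output and can be carried into later runs as a dictionary file. A hypothetical round trip — the file name is illustrative, the octal escapes are rewritten into the \xHH form libFuzzer dictionary files accept, and whether the llvm_nvme_fuzz wrapper forwards libFuzzer's -dict= flag is an assumption not confirmed by this log:

# Hypothetical: save a recommended entry ("MI\036\002\000\000\000\000", octal)
# in \xHH form for reuse; M=0x4d, I=0x49, \036=0x1e, \002=0x02.
cat > /tmp/llvm_nvmf_11.dict <<'EOF'
persautodict_1="\x4d\x49\x1e\x02\x00\x00\x00\x00"
EOF
# A plain libFuzzer target would then consume it via -dict=/tmp/llvm_nvmf_11.dict;
# forwarding that flag through the SPDK wrapper is assumed, not shown in this log.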
00:07:09.171 [2024-12-02 15:22:54.744066] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1600800 ] 00:07:09.171 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.428 [2024-12-02 15:22:54.993387] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.428 [2024-12-02 15:22:55.077226] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:09.428 [2024-12-02 15:22:55.077350] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.428 [2024-12-02 15:22:55.135204] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:09.428 [2024-12-02 15:22:55.151515] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:07:09.428 INFO: Running with entropic power schedule (0xFF, 100). 00:07:09.428 INFO: Seed: 2440126867 00:07:09.428 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:09.428 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:09.428 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:09.429 INFO: A corpus is not provided, starting from an empty corpus 00:07:09.429 #2 INITED exec/s: 0 rss: 61Mb 00:07:09.429 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:09.429 This may also happen if the target rejected all inputs we tried so far 00:07:09.429 [2024-12-02 15:22:55.199126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:1a00964b cdw11:b60d1a3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.429 [2024-12-02 15:22:55.199153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.687 NEW_FUNC[1/671]: 0x44b168 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:07:09.687 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:09.687 #4 NEW cov: 11590 ft: 11591 corp: 2/10b lim: 40 exec/s: 0 rss: 68Mb L: 9/9 MS: 2 ChangeBit-CMP- DE: "\000\226K\266\015\032?T"- 00:07:09.687 [2024-12-02 15:22:55.500519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.687 [2024-12-02 15:22:55.500557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.687 [2024-12-02 15:22:55.500626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.687 [2024-12-02 15:22:55.500644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.687 [2024-12-02 15:22:55.500710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.687 [2024-12-02 15:22:55.500727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.687 [2024-12-02 15:22:55.500800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.687 [2024-12-02 15:22:55.500817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.946 #5 NEW cov: 11703 ft: 12765 corp: 3/45b lim: 40 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:09.946 [2024-12-02 15:22:55.540037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a7e0096 cdw11:4bb60d1a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.946 [2024-12-02 15:22:55.540062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.946 #8 NEW cov: 11709 ft: 13117 corp: 4/55b lim: 40 exec/s: 0 rss: 68Mb L: 10/35 MS: 3 InsertByte-ChangeByte-PersAutoDict- DE: "\000\226K\266\015\032?T"- 00:07:09.946 [2024-12-02 15:22:55.580132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00964bb6 cdw11:0d1a3f54 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.946 [2024-12-02 15:22:55.580157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.946 #10 NEW cov: 11794 ft: 13405 corp: 5/65b lim: 40 exec/s: 0 rss: 68Mb L: 10/35 MS: 2 CrossOver-PersAutoDict- DE: "\000\226K\266\015\032?T"- 00:07:09.946 [2024-12-02 15:22:55.620748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.946 [2024-12-02 15:22:55.620773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.946 [2024-12-02 15:22:55.620831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.946 [2024-12-02 15:22:55.620845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.946 [2024-12-02 15:22:55.620920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.946 [2024-12-02 15:22:55.620935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.946 [2024-12-02 15:22:55.620993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.946 [2024-12-02 15:22:55.621006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.946 #11 NEW cov: 11794 ft: 13601 corp: 6/100b lim: 40 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 CMP- DE: "\002\000\000\000\000\000\000\000"- 00:07:09.946 [2024-12-02 15:22:55.670431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a7e0096 cdw11:4b23b60d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.946 [2024-12-02 15:22:55.670457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.946 #12 NEW cov: 11794 ft: 13688 corp: 7/111b lim: 40 exec/s: 0 rss: 68Mb L: 11/35 MS: 1 InsertByte- 00:07:09.946 [2024-12-02 15:22:55.710705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.946 [2024-12-02 15:22:55.710730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.946 [2024-12-02 15:22:55.710809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.946 [2024-12-02 15:22:55.710823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.946 #14 NEW cov: 11794 ft: 13968 corp: 8/131b lim: 40 exec/s: 0 rss: 68Mb L: 20/35 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:09.946 [2024-12-02 15:22:55.751123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:964bb60d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.946 [2024-12-02 15:22:55.751147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.946 [2024-12-02 15:22:55.751222] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:1a3f5400 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.946 [2024-12-02 15:22:55.751236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.946 [2024-12-02 15:22:55.751295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.946 [2024-12-02 15:22:55.751308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.946 [2024-12-02 15:22:55.751368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:09.946 [2024-12-02 15:22:55.751381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.205 #15 NEW cov: 11794 ft: 14029 corp: 9/166b lim: 40 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 PersAutoDict- DE: "\000\226K\266\015\032?T"- 00:07:10.205 [2024-12-02 15:22:55.790761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2a020000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.205 [2024-12-02 15:22:55.790785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.205 #20 NEW cov: 11794 ft: 14089 corp: 10/175b lim: 40 exec/s: 0 rss: 68Mb L: 9/35 MS: 5 ChangeBit-ChangeByte-ChangeByte-ChangeByte-PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:07:10.205 [2024-12-02 15:22:55.821031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:fffffdff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.205 [2024-12-02 15:22:55.821056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.205 [2024-12-02 15:22:55.821114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.205 [2024-12-02 15:22:55.821128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.205 #21 NEW cov: 11794 ft: 14135 corp: 11/195b lim: 40 exec/s: 0 rss: 68Mb L: 20/35 MS: 1 ChangeBit- 00:07:10.205 [2024-12-02 15:22:55.861174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.205 [2024-12-02 15:22:55.861199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.205 [2024-12-02 15:22:55.861275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff27ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.205 [2024-12-02 15:22:55.861289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.205 #22 NEW cov: 11794 ft: 14199 corp: 12/216b lim: 40 exec/s: 0 rss: 68Mb L: 21/35 MS: 1 InsertByte- 00:07:10.205 [2024-12-02 15:22:55.901568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.205 [2024-12-02 15:22:55.901594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.205 [2024-12-02 15:22:55.901668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.206 [2024-12-02 15:22:55.901682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.206 [2024-12-02 15:22:55.901743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.206 [2024-12-02 15:22:55.901757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.206 [2024-12-02 15:22:55.901815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.206 [2024-12-02 15:22:55.901829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.206 #23 NEW cov: 11794 ft: 14249 corp: 13/255b lim: 40 exec/s: 0 rss: 68Mb L: 39/39 MS: 1 CopyPart- 00:07:10.206 [2024-12-02 15:22:55.951248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:02000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.206 [2024-12-02 15:22:55.951273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.206 #24 NEW cov: 11794 ft: 14263 corp: 14/264b lim: 40 exec/s: 0 rss: 68Mb L: 9/39 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:07:10.206 [2024-12-02 15:22:55.981455] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.206 [2024-12-02 15:22:55.981479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.206 [2024-12-02 15:22:55.981555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.206 [2024-12-02 15:22:55.981574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.206 #25 NEW cov: 11794 ft: 14318 corp: 15/285b lim: 40 exec/s: 0 rss: 69Mb L: 21/39 MS: 1 InsertByte- 00:07:10.206 [2024-12-02 15:22:56.021612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.206 [2024-12-02 15:22:56.021637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.206 [2024-12-02 15:22:56.021699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff2800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.206 [2024-12-02 15:22:56.021713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.465 #26 NEW cov: 11794 ft: 14428 corp: 16/306b lim: 40 exec/s: 0 rss: 69Mb L: 21/39 MS: 1 ChangeBinInt- 00:07:10.465 [2024-12-02 15:22:56.062263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.465 [2024-12-02 15:22:56.062288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.465 [2024-12-02 15:22:56.062348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.465 [2024-12-02 15:22:56.062362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.465 [2024-12-02 15:22:56.062421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.465 [2024-12-02 15:22:56.062434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.465 [2024-12-02 15:22:56.062492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.465 [2024-12-02 15:22:56.062505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.465 [2024-12-02 15:22:56.062562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:ffff0000 cdw11:0000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.465 [2024-12-02 15:22:56.062575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:10.465 
NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:10.465 #27 NEW cov: 11817 ft: 14532 corp: 17/346b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 CrossOver- 00:07:10.465 [2024-12-02 15:22:56.101835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.465 [2024-12-02 15:22:56.101860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.465 [2024-12-02 15:22:56.101920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff65ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.465 [2024-12-02 15:22:56.101933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.465 #28 NEW cov: 11817 ft: 14561 corp: 18/366b lim: 40 exec/s: 0 rss: 69Mb L: 20/40 MS: 1 ChangeByte- 00:07:10.465 [2024-12-02 15:22:56.142308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.465 [2024-12-02 15:22:56.142334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.465 [2024-12-02 15:22:56.142397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.465 [2024-12-02 15:22:56.142411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.465 [2024-12-02 15:22:56.142469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.465 [2024-12-02 15:22:56.142482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.465 [2024-12-02 15:22:56.142543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.465 [2024-12-02 15:22:56.142557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.465 #29 NEW cov: 11817 ft: 14577 corp: 19/405b lim: 40 exec/s: 0 rss: 69Mb L: 39/40 MS: 1 ChangeBit- 00:07:10.465 [2024-12-02 15:22:56.181939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:1a00964b cdw11:b60d1a3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.465 [2024-12-02 15:22:56.181965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.465 #30 NEW cov: 11817 ft: 14625 corp: 20/414b lim: 40 exec/s: 30 rss: 69Mb L: 9/40 MS: 1 ShuffleBytes- 00:07:10.465 [2024-12-02 15:22:56.222045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:1a00964b cdw11:4af2e5c5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.465 [2024-12-02 15:22:56.222070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.465 #31 
NEW cov: 11817 ft: 14637 corp: 21/423b lim: 40 exec/s: 31 rss: 69Mb L: 9/40 MS: 1 ChangeBinInt- 00:07:10.465 [2024-12-02 15:22:56.262649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.465 [2024-12-02 15:22:56.262674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.466 [2024-12-02 15:22:56.262737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.466 [2024-12-02 15:22:56.262751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.466 [2024-12-02 15:22:56.262809] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.466 [2024-12-02 15:22:56.262822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.466 [2024-12-02 15:22:56.262881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00964bb6 cdw11:0d1a3f54 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.466 [2024-12-02 15:22:56.262894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.466 #32 NEW cov: 11817 ft: 14706 corp: 22/458b lim: 40 exec/s: 32 rss: 69Mb L: 35/40 MS: 1 PersAutoDict- DE: "\000\226K\266\015\032?T"- 00:07:10.725 [2024-12-02 15:22:56.302433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.725 [2024-12-02 15:22:56.302458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.725 [2024-12-02 15:22:56.302531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff7128 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.725 [2024-12-02 15:22:56.302549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.725 #33 NEW cov: 11817 ft: 14740 corp: 23/480b lim: 40 exec/s: 33 rss: 69Mb L: 22/40 MS: 1 InsertByte- 00:07:10.725 [2024-12-02 15:22:56.342925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.725 [2024-12-02 15:22:56.342950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.725 [2024-12-02 15:22:56.343026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:f5000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.725 [2024-12-02 15:22:56.343041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.725 [2024-12-02 15:22:56.343099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.725 
[2024-12-02 15:22:56.343113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.725 [2024-12-02 15:22:56.343170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.725 [2024-12-02 15:22:56.343183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.725 #34 NEW cov: 11817 ft: 14752 corp: 24/515b lim: 40 exec/s: 34 rss: 69Mb L: 35/40 MS: 1 ChangeByte- 00:07:10.725 [2024-12-02 15:22:56.382617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2a020000 cdw11:00008383 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.725 [2024-12-02 15:22:56.382642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.725 [2024-12-02 15:22:56.382702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:83838383 cdw11:83838300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.725 [2024-12-02 15:22:56.382716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.725 #35 NEW cov: 11817 ft: 14759 corp: 25/533b lim: 40 exec/s: 35 rss: 69Mb L: 18/40 MS: 1 InsertRepeatedBytes- 00:07:10.725 [2024-12-02 15:22:56.423125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:96000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.725 [2024-12-02 15:22:56.423151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.725 [2024-12-02 15:22:56.423211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:02000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.725 [2024-12-02 15:22:56.423226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.725 [2024-12-02 15:22:56.423268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.725 [2024-12-02 15:22:56.423282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.725 [2024-12-02 15:22:56.423359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0000964b cdw11:b60d1a3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.725 [2024-12-02 15:22:56.423375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.725 #36 NEW cov: 11817 ft: 14773 corp: 26/569b lim: 40 exec/s: 36 rss: 69Mb L: 36/40 MS: 1 CrossOver- 00:07:10.725 [2024-12-02 15:22:56.472739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:1a4b96b6 cdw11:000d1a3f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.725 [2024-12-02 15:22:56.472765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.726 #37 NEW cov: 11817 ft: 14775 corp: 27/578b lim: 40 exec/s: 37 rss: 
69Mb L: 9/40 MS: 1 ShuffleBytes- 00:07:10.726 [2024-12-02 15:22:56.513034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000fe cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.726 [2024-12-02 15:22:56.513059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.726 [2024-12-02 15:22:56.513119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.726 [2024-12-02 15:22:56.513133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.726 #38 NEW cov: 11817 ft: 14783 corp: 28/598b lim: 40 exec/s: 38 rss: 69Mb L: 20/40 MS: 1 ChangeBinInt- 00:07:10.985 [2024-12-02 15:22:56.553474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a7e0096 cdw11:4b230000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.985 [2024-12-02 15:22:56.553499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.985 [2024-12-02 15:22:56.553558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:02000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.985 [2024-12-02 15:22:56.553572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.985 [2024-12-02 15:22:56.553641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.985 [2024-12-02 15:22:56.553655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.985 [2024-12-02 15:22:56.553711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.985 [2024-12-02 15:22:56.553724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.985 #39 NEW cov: 11817 ft: 14888 corp: 29/636b lim: 40 exec/s: 39 rss: 69Mb L: 38/40 MS: 1 CrossOver- 00:07:10.985 [2024-12-02 15:22:56.603144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:1a00964a cdw11:4bf2e5c5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.985 [2024-12-02 15:22:56.603169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.985 #40 NEW cov: 11817 ft: 14926 corp: 30/645b lim: 40 exec/s: 40 rss: 70Mb L: 9/40 MS: 1 ShuffleBytes- 00:07:10.985 [2024-12-02 15:22:56.643423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.985 [2024-12-02 15:22:56.643447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.985 [2024-12-02 15:22:56.643523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:c0ff7128 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:10.985 [2024-12-02 15:22:56.643538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.985 #41 NEW cov: 11817 ft: 14946 corp: 31/667b lim: 40 exec/s: 41 rss: 70Mb L: 22/40 MS: 1 ChangeByte- 00:07:10.985 [2024-12-02 15:22:56.683341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00964bb6 cdw11:0d1a3f00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.985 [2024-12-02 15:22:56.683369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.985 #42 NEW cov: 11817 ft: 14951 corp: 32/681b lim: 40 exec/s: 42 rss: 70Mb L: 14/40 MS: 1 CrossOver- 00:07:10.985 [2024-12-02 15:22:56.723502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a7e0096 cdw11:7e23b60d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.985 [2024-12-02 15:22:56.723528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.985 #43 NEW cov: 11817 ft: 14955 corp: 33/692b lim: 40 exec/s: 43 rss: 70Mb L: 11/40 MS: 1 CopyPart- 00:07:10.985 [2024-12-02 15:22:56.764111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.985 [2024-12-02 15:22:56.764136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.985 [2024-12-02 15:22:56.764211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.985 [2024-12-02 15:22:56.764225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.985 [2024-12-02 15:22:56.764287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.985 [2024-12-02 15:22:56.764300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.985 [2024-12-02 15:22:56.764358] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:10.985 [2024-12-02 15:22:56.764371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.985 #44 NEW cov: 11817 ft: 14963 corp: 34/730b lim: 40 exec/s: 44 rss: 70Mb L: 38/40 MS: 1 EraseBytes- 00:07:11.245 [2024-12-02 15:22:56.804157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a7e0096 cdw11:4bb60dff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.245 [2024-12-02 15:22:56.804183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.245 [2024-12-02 15:22:56.804245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.245 [2024-12-02 15:22:56.804259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 
cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.245 [2024-12-02 15:22:56.804319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.245 [2024-12-02 15:22:56.804333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.245 [2024-12-02 15:22:56.804392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.245 [2024-12-02 15:22:56.804405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.245 #45 NEW cov: 11817 ft: 14970 corp: 35/769b lim: 40 exec/s: 45 rss: 70Mb L: 39/40 MS: 1 InsertRepeatedBytes- 00:07:11.245 [2024-12-02 15:22:56.844253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.246 [2024-12-02 15:22:56.844278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.246 [2024-12-02 15:22:56.844344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.246 [2024-12-02 15:22:56.844357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.246 [2024-12-02 15:22:56.844416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000029 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.246 [2024-12-02 15:22:56.844429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.246 [2024-12-02 15:22:56.844487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00964bb6 cdw11:0d1a3f54 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.246 [2024-12-02 15:22:56.844500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.246 #46 NEW cov: 11817 ft: 14971 corp: 36/804b lim: 40 exec/s: 46 rss: 70Mb L: 35/40 MS: 1 ChangeByte- 00:07:11.246 [2024-12-02 15:22:56.884070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.246 [2024-12-02 15:22:56.884095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.246 [2024-12-02 15:22:56.884171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:fff72800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.246 [2024-12-02 15:22:56.884184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.246 #47 NEW cov: 11817 ft: 14983 corp: 37/825b lim: 40 exec/s: 47 rss: 70Mb L: 21/40 MS: 1 ChangeBit- 00:07:11.246 [2024-12-02 15:22:56.924476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:11.246 [2024-12-02 15:22:56.924501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.246 [2024-12-02 15:22:56.924562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.246 [2024-12-02 15:22:56.924576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.246 [2024-12-02 15:22:56.924634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000029 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.246 [2024-12-02 15:22:56.924648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.246 [2024-12-02 15:22:56.924706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00964bb6 cdw11:0d1a3f54 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.246 [2024-12-02 15:22:56.924718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.246 #48 NEW cov: 11817 ft: 15003 corp: 38/860b lim: 40 exec/s: 48 rss: 70Mb L: 35/40 MS: 1 ChangeBinInt- 00:07:11.246 [2024-12-02 15:22:56.964159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:1a00964b cdw11:4a2fe5c5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.246 [2024-12-02 15:22:56.964184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.246 #49 NEW cov: 11817 ft: 15052 corp: 39/869b lim: 40 exec/s: 49 rss: 70Mb L: 9/40 MS: 1 ChangeByte- 00:07:11.246 [2024-12-02 15:22:57.004255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2a000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.246 [2024-12-02 15:22:57.004283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.246 #50 NEW cov: 11817 ft: 15069 corp: 40/880b lim: 40 exec/s: 50 rss: 70Mb L: 11/40 MS: 1 CrossOver- 00:07:11.246 [2024-12-02 15:22:57.044494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:1bffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.246 [2024-12-02 15:22:57.044520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.246 [2024-12-02 15:22:57.044579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff65ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.246 [2024-12-02 15:22:57.044593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.505 #51 NEW cov: 11817 ft: 15085 corp: 41/900b lim: 40 exec/s: 51 rss: 70Mb L: 20/40 MS: 1 ChangeByte- 00:07:11.505 [2024-12-02 15:22:57.085012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2a020000 cdw11:0000ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.505 [2024-12-02 15:22:57.085037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.505 [2024-12-02 15:22:57.085097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.505 [2024-12-02 15:22:57.085110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.505 [2024-12-02 15:22:57.085167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.505 [2024-12-02 15:22:57.085181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.505 [2024-12-02 15:22:57.085241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.505 [2024-12-02 15:22:57.085255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.505 #52 NEW cov: 11817 ft: 15108 corp: 42/938b lim: 40 exec/s: 52 rss: 70Mb L: 38/40 MS: 1 InsertRepeatedBytes- 00:07:11.505 [2024-12-02 15:22:57.124749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:000000ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.505 [2024-12-02 15:22:57.124774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.505 [2024-12-02 15:22:57.124853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:00964bb6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.505 [2024-12-02 15:22:57.124867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.505 #53 NEW cov: 11817 ft: 15113 corp: 43/959b lim: 40 exec/s: 53 rss: 70Mb L: 21/40 MS: 1 PersAutoDict- DE: "\000\226K\266\015\032?T"- 00:07:11.505 [2024-12-02 15:22:57.165204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.505 [2024-12-02 15:22:57.165229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.505 [2024-12-02 15:22:57.165303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.505 [2024-12-02 15:22:57.165317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:11.505 [2024-12-02 15:22:57.165375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.505 [2024-12-02 15:22:57.165391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:11.505 [2024-12-02 15:22:57.165451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.505 [2024-12-02 15:22:57.165464] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:11.505 #54 NEW cov: 11817 ft: 15123 corp: 44/997b lim: 40 exec/s: 27 rss: 70Mb L: 38/40 MS: 1 ChangeBit- 00:07:11.505 #54 DONE cov: 11817 ft: 15123 corp: 44/997b lim: 40 exec/s: 27 rss: 70Mb 00:07:11.505 ###### Recommended dictionary. ###### 00:07:11.505 "\000\226K\266\015\032?T" # Uses: 5 00:07:11.505 "\002\000\000\000\000\000\000\000" # Uses: 2 00:07:11.505 ###### End of recommended dictionary. ###### 00:07:11.505 Done 54 runs in 2 second(s) 00:07:11.505 15:22:57 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:07:11.505 15:22:57 -- ../common.sh@72 -- # (( i++ )) 00:07:11.505 15:22:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:11.505 15:22:57 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:07:11.505 15:22:57 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:07:11.505 15:22:57 -- nvmf/run.sh@24 -- # local timen=1 00:07:11.505 15:22:57 -- nvmf/run.sh@25 -- # local core=0x1 00:07:11.505 15:22:57 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:11.505 15:22:57 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:07:11.505 15:22:57 -- nvmf/run.sh@29 -- # printf %02d 13 00:07:11.505 15:22:57 -- nvmf/run.sh@29 -- # port=4413 00:07:11.505 15:22:57 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:11.763 15:22:57 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:07:11.763 15:22:57 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:11.763 15:22:57 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:07:11.763 [2024-12-02 15:22:57.361897] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:11.763 [2024-12-02 15:22:57.361986] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1601154 ] 00:07:11.763 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.022 [2024-12-02 15:22:57.615308] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.022 [2024-12-02 15:22:57.697623] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:12.022 [2024-12-02 15:22:57.697768] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.022 [2024-12-02 15:22:57.755583] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:12.022 [2024-12-02 15:22:57.771914] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:07:12.022 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:12.022 INFO: Seed: 769044267 00:07:12.022 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:12.022 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:12.022 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:12.022 INFO: A corpus is not provided, starting from an empty corpus 00:07:12.022 #2 INITED exec/s: 0 rss: 60Mb 00:07:12.022 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:12.022 This may also happen if the target rejected all inputs we tried so far 00:07:12.022 [2024-12-02 15:22:57.827132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.022 [2024-12-02 15:22:57.827160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.541 NEW_FUNC[1/670]: 0x44cd38 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:07:12.541 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:12.541 #5 NEW cov: 11571 ft: 11579 corp: 2/13b lim: 40 exec/s: 0 rss: 68Mb L: 12/12 MS: 3 ChangeBinInt-ShuffleBytes-InsertRepeatedBytes- 00:07:12.541 [2024-12-02 15:22:58.149004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.541 [2024-12-02 15:22:58.149054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.541 [2024-12-02 15:22:58.149210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.541 [2024-12-02 15:22:58.149232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.541 [2024-12-02 15:22:58.149362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.541 [2024-12-02 15:22:58.149383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.541 #6 NEW cov: 11691 ft: 12686 corp: 3/38b lim: 40 exec/s: 0 rss: 68Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:07:12.541 [2024-12-02 15:22:58.199240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:807c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.541 [2024-12-02 15:22:58.199270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.541 [2024-12-02 15:22:58.199405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.541 [2024-12-02 15:22:58.199422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.541 [2024-12-02 15:22:58.199544] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.541 [2024-12-02 15:22:58.199560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.541 [2024-12-02 15:22:58.199683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.541 [2024-12-02 15:22:58.199700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.541 #9 NEW cov: 11697 ft: 13282 corp: 4/75b lim: 40 exec/s: 0 rss: 68Mb L: 37/37 MS: 3 CrossOver-ChangeBit-InsertRepeatedBytes- 00:07:12.541 [2024-12-02 15:22:58.239348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:807c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.541 [2024-12-02 15:22:58.239377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.541 [2024-12-02 15:22:58.239518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.541 [2024-12-02 15:22:58.239534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.541 [2024-12-02 15:22:58.239657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.541 [2024-12-02 15:22:58.239676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.541 [2024-12-02 15:22:58.239797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.541 [2024-12-02 15:22:58.239813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.541 #10 NEW cov: 11782 ft: 13589 corp: 5/113b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 CopyPart- 00:07:12.541 [2024-12-02 15:22:58.289273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.541 [2024-12-02 15:22:58.289300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.541 [2024-12-02 15:22:58.289428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.541 [2024-12-02 15:22:58.289445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.542 [2024-12-02 15:22:58.289563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:35000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.542 [2024-12-02 15:22:58.289579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.542 #11 NEW cov: 11782 ft: 13702 corp: 6/138b lim: 40 exec/s: 0 rss: 68Mb L: 25/38 MS: 1 ChangeASCIIInt- 00:07:12.542 [2024-12-02 15:22:58.339681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:807c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.542 [2024-12-02 15:22:58.339707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.542 [2024-12-02 15:22:58.339838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.542 [2024-12-02 15:22:58.339854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.542 [2024-12-02 15:22:58.339973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.542 [2024-12-02 15:22:58.339989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.542 [2024-12-02 15:22:58.340113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7ccc cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.542 [2024-12-02 15:22:58.340128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.542 #12 NEW cov: 11782 ft: 13772 corp: 7/175b lim: 40 exec/s: 0 rss: 68Mb L: 37/38 MS: 1 ChangeByte- 00:07:12.801 [2024-12-02 15:22:58.379162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.801 [2024-12-02 15:22:58.379189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.801 #13 NEW cov: 11782 ft: 13893 corp: 8/187b lim: 40 exec/s: 0 rss: 68Mb L: 12/38 MS: 1 CopyPart- 00:07:12.801 [2024-12-02 15:22:58.419469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01003939 cdw11:39393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.801 [2024-12-02 15:22:58.419498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.801 [2024-12-02 15:22:58.419621] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39393900 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.801 [2024-12-02 15:22:58.419638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.801 #14 NEW cov: 11782 ft: 14124 corp: 9/206b lim: 40 exec/s: 0 rss: 68Mb L: 19/38 MS: 1 EraseBytes- 00:07:12.801 [2024-12-02 15:22:58.459638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01003980 cdw11:7c393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.801 [2024-12-02 15:22:58.459666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.801 [2024-12-02 15:22:58.459784] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39397c39 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.801 [2024-12-02 15:22:58.459800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.801 #15 NEW cov: 11782 ft: 14144 corp: 10/224b lim: 40 exec/s: 0 rss: 68Mb L: 18/38 MS: 1 CrossOver- 00:07:12.801 [2024-12-02 15:22:58.499557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01003939 cdw11:39393900 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.801 [2024-12-02 15:22:58.499584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.801 #16 NEW cov: 11782 ft: 14175 corp: 11/234b lim: 40 exec/s: 0 rss: 68Mb L: 10/38 MS: 1 EraseBytes- 00:07:12.801 [2024-12-02 15:22:58.539654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.801 [2024-12-02 15:22:58.539680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.801 #17 NEW cov: 11782 ft: 14250 corp: 12/246b lim: 40 exec/s: 0 rss: 68Mb L: 12/38 MS: 1 ChangeBit- 00:07:12.801 [2024-12-02 15:22:58.579793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01001800 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:12.801 [2024-12-02 15:22:58.579819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.801 #18 NEW cov: 11782 ft: 14285 corp: 13/258b lim: 40 exec/s: 0 rss: 68Mb L: 12/38 MS: 1 ChangeByte- 00:07:13.061 [2024-12-02 15:22:58.620380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01010000 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.620406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.061 [2024-12-02 15:22:58.620527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00010000 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.620546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.061 [2024-12-02 15:22:58.620666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.620683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.061 #19 NEW cov: 11782 ft: 14305 corp: 14/282b lim: 40 exec/s: 0 rss: 68Mb L: 24/38 MS: 1 CrossOver- 00:07:13.061 [2024-12-02 15:22:58.660093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.660123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.061 #20 NEW cov: 11782 ft: 14317 
corp: 15/294b lim: 40 exec/s: 0 rss: 68Mb L: 12/38 MS: 1 CrossOver- 00:07:13.061 [2024-12-02 15:22:58.701048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01010000 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.701073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.061 [2024-12-02 15:22:58.701197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00010000 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.701221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.061 [2024-12-02 15:22:58.701339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0001dbdb cdw11:dbdbdbdb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.701354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.061 [2024-12-02 15:22:58.701470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:dbdbdbdb cdw11:dbdbdbdb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.701486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.061 [2024-12-02 15:22:58.701613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:dbdb0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.701630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:13.061 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:13.061 #21 NEW cov: 11805 ft: 14403 corp: 16/334b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:13.061 [2024-12-02 15:22:58.750610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01003980 cdw11:7c393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.750636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.061 [2024-12-02 15:22:58.750762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39387c39 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.750790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.061 #22 NEW cov: 11805 ft: 14487 corp: 17/352b lim: 40 exec/s: 0 rss: 69Mb L: 18/40 MS: 1 ChangeASCIIInt- 00:07:13.061 [2024-12-02 15:22:58.791069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:807c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.791096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.061 [2024-12-02 15:22:58.791226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.791242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.061 [2024-12-02 15:22:58.791366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.791383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.061 [2024-12-02 15:22:58.791511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.791526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.061 #23 NEW cov: 11805 ft: 14505 corp: 18/389b lim: 40 exec/s: 23 rss: 69Mb L: 37/40 MS: 1 CopyPart- 00:07:13.061 [2024-12-02 15:22:58.830751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01003980 cdw11:7c393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.830779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.061 [2024-12-02 15:22:58.830911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39387c2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.830927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.061 #24 NEW cov: 11805 ft: 14518 corp: 19/407b lim: 40 exec/s: 24 rss: 69Mb L: 18/40 MS: 1 ChangeByte- 00:07:13.061 [2024-12-02 15:22:58.870883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01003980 cdw11:7c393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.870910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.061 [2024-12-02 15:22:58.871047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39397c39 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.061 [2024-12-02 15:22:58.871061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.321 #25 NEW cov: 11805 ft: 14556 corp: 20/425b lim: 40 exec/s: 25 rss: 69Mb L: 18/40 MS: 1 ShuffleBytes- 00:07:13.321 [2024-12-02 15:22:58.911350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01001800 cdw11:0000007c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.321 [2024-12-02 15:22:58.911376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.321 [2024-12-02 15:22:58.911500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.321 [2024-12-02 15:22:58.911518] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.321 [2024-12-02 15:22:58.911642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.321 [2024-12-02 15:22:58.911657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.321 [2024-12-02 15:22:58.911790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.321 [2024-12-02 15:22:58.911805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.321 #26 NEW cov: 11805 ft: 14572 corp: 21/458b lim: 40 exec/s: 26 rss: 69Mb L: 33/40 MS: 1 CrossOver- 00:07:13.321 [2024-12-02 15:22:58.951206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01003980 cdw11:7c393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.321 [2024-12-02 15:22:58.951234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.322 [2024-12-02 15:22:58.951360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39387c39 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.322 [2024-12-02 15:22:58.951379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.322 #27 NEW cov: 11805 ft: 14577 corp: 22/476b lim: 40 exec/s: 27 rss: 69Mb L: 18/40 MS: 1 ChangeASCIIInt- 00:07:13.322 [2024-12-02 15:22:58.991261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01003980 cdw11:7c393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.322 [2024-12-02 15:22:58.991288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.322 [2024-12-02 15:22:58.991406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39387c2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.322 [2024-12-02 15:22:58.991423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.322 #28 NEW cov: 11805 ft: 14589 corp: 23/494b lim: 40 exec/s: 28 rss: 69Mb L: 18/40 MS: 1 ShuffleBytes- 00:07:13.322 [2024-12-02 15:22:59.031630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01010000 cdw11:00670000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.322 [2024-12-02 15:22:59.031658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.322 [2024-12-02 15:22:59.031788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:02000100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.322 [2024-12-02 15:22:59.031805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.322 [2024-12-02 15:22:59.031928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE 
(1a) qid:0 cid:6 nsid:0 cdw10:02000100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.322 [2024-12-02 15:22:59.031944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.322 #29 NEW cov: 11805 ft: 14604 corp: 24/519b lim: 40 exec/s: 29 rss: 69Mb L: 25/40 MS: 1 InsertByte- 00:07:13.322 [2024-12-02 15:22:59.071416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.322 [2024-12-02 15:22:59.071441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.322 [2024-12-02 15:22:59.071560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:01807c39 cdw11:39393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.322 [2024-12-02 15:22:59.071576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.322 #30 NEW cov: 11805 ft: 14632 corp: 25/539b lim: 40 exec/s: 30 rss: 69Mb L: 20/40 MS: 1 CrossOver- 00:07:13.322 [2024-12-02 15:22:59.111371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000a01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.322 [2024-12-02 15:22:59.111397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.322 #31 NEW cov: 11805 ft: 14653 corp: 26/551b lim: 40 exec/s: 31 rss: 69Mb L: 12/40 MS: 1 CopyPart- 00:07:13.582 [2024-12-02 15:22:59.151488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:03000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.582 [2024-12-02 15:22:59.151515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.582 #32 NEW cov: 11805 ft: 14670 corp: 27/563b lim: 40 exec/s: 32 rss: 69Mb L: 12/40 MS: 1 ChangeBit- 00:07:13.582 [2024-12-02 15:22:59.191632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01003939 cdw11:39010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.582 [2024-12-02 15:22:59.191664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.582 #33 NEW cov: 11805 ft: 14697 corp: 28/573b lim: 40 exec/s: 33 rss: 69Mb L: 10/40 MS: 1 CMP- DE: "\001\000\000\000"- 00:07:13.582 [2024-12-02 15:22:59.242417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:807c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.582 [2024-12-02 15:22:59.242444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.582 [2024-12-02 15:22:59.242571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.582 [2024-12-02 15:22:59.242587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.582 [2024-12-02 15:22:59.242686] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.582 [2024-12-02 15:22:59.242702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.582 [2024-12-02 15:22:59.242832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.582 [2024-12-02 15:22:59.242850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.582 #34 NEW cov: 11805 ft: 14738 corp: 29/607b lim: 40 exec/s: 34 rss: 69Mb L: 34/40 MS: 1 EraseBytes- 00:07:13.582 [2024-12-02 15:22:59.292414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01010000 cdw11:00670000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.582 [2024-12-02 15:22:59.292443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.582 [2024-12-02 15:22:59.292580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:02000100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.582 [2024-12-02 15:22:59.292597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.582 [2024-12-02 15:22:59.292730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:02000100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.582 [2024-12-02 15:22:59.292751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.582 #35 NEW cov: 11805 ft: 14755 corp: 30/632b lim: 40 exec/s: 35 rss: 70Mb L: 25/40 MS: 1 ChangeBit- 00:07:13.582 [2024-12-02 15:22:59.342071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01001800 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.582 [2024-12-02 15:22:59.342098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.582 #36 NEW cov: 11805 ft: 14764 corp: 31/645b lim: 40 exec/s: 36 rss: 70Mb L: 13/40 MS: 1 InsertByte- 00:07:13.582 [2024-12-02 15:22:59.382670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.582 [2024-12-02 15:22:59.382697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.582 [2024-12-02 15:22:59.382831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffff00 cdw11:39393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.582 [2024-12-02 15:22:59.382851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.582 [2024-12-02 15:22:59.382980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:39393939 cdw11:39393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.582 [2024-12-02 15:22:59.382997] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.844 #37 NEW cov: 11805 ft: 14781 corp: 32/674b lim: 40 exec/s: 37 rss: 70Mb L: 29/40 MS: 1 InsertRepeatedBytes- 00:07:13.844 [2024-12-02 15:22:59.432373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000200 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.844 [2024-12-02 15:22:59.432401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.844 #38 NEW cov: 11805 ft: 14787 corp: 33/686b lim: 40 exec/s: 38 rss: 70Mb L: 12/40 MS: 1 ChangeByte- 00:07:13.844 [2024-12-02 15:22:59.472492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.844 [2024-12-02 15:22:59.472518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.844 #39 NEW cov: 11805 ft: 14798 corp: 34/698b lim: 40 exec/s: 39 rss: 70Mb L: 12/40 MS: 1 CopyPart- 00:07:13.844 [2024-12-02 15:22:59.513517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:807c7c7c cdw11:7c7c2e7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.844 [2024-12-02 15:22:59.513544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.844 [2024-12-02 15:22:59.513675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.844 [2024-12-02 15:22:59.513693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.844 [2024-12-02 15:22:59.513830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.844 [2024-12-02 15:22:59.513849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.844 [2024-12-02 15:22:59.513988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.844 [2024-12-02 15:22:59.514005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.844 [2024-12-02 15:22:59.514125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.844 [2024-12-02 15:22:59.514143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:13.844 #40 NEW cov: 11805 ft: 14817 corp: 35/738b lim: 40 exec/s: 40 rss: 70Mb L: 40/40 MS: 1 CrossOver- 00:07:13.844 [2024-12-02 15:22:59.552656] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01003980 cdw11:7c393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.844 [2024-12-02 15:22:59.552683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.844 [2024-12-02 15:22:59.552813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:807c3939 cdw11:39393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.844 [2024-12-02 15:22:59.552832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.844 #41 NEW cov: 11805 ft: 14841 corp: 36/756b lim: 40 exec/s: 41 rss: 70Mb L: 18/40 MS: 1 CopyPart- 00:07:13.844 [2024-12-02 15:22:59.593576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:65807c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.844 [2024-12-02 15:22:59.593602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.844 [2024-12-02 15:22:59.593737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.844 [2024-12-02 15:22:59.593772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.844 [2024-12-02 15:22:59.593896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.844 [2024-12-02 15:22:59.593912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.844 [2024-12-02 15:22:59.594035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:cc7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.844 [2024-12-02 15:22:59.594053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:13.844 #43 NEW cov: 11805 ft: 14842 corp: 37/792b lim: 40 exec/s: 43 rss: 70Mb L: 36/40 MS: 2 InsertByte-CrossOver- 00:07:13.844 [2024-12-02 15:22:59.633309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01003980 cdw11:01003980 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.844 [2024-12-02 15:22:59.633335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.844 [2024-12-02 15:22:59.633480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c393939 cdw11:39393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.844 [2024-12-02 15:22:59.633497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.844 #44 NEW cov: 11805 ft: 14849 corp: 38/810b lim: 40 exec/s: 44 rss: 70Mb L: 18/40 MS: 1 CopyPart- 00:07:14.105 [2024-12-02 15:22:59.673398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01003923 cdw11:7c393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.105 [2024-12-02 15:22:59.673425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.105 [2024-12-02 15:22:59.673558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:39393939 
cdw11:39387c2e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.105 [2024-12-02 15:22:59.673575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.105 #45 NEW cov: 11805 ft: 14865 corp: 39/828b lim: 40 exec/s: 45 rss: 70Mb L: 18/40 MS: 1 ChangeByte- 00:07:14.105 [2024-12-02 15:22:59.713506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:7c393939 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.105 [2024-12-02 15:22:59.713533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.105 [2024-12-02 15:22:59.713657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:39393939 cdw11:39397c39 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.105 [2024-12-02 15:22:59.713674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.105 #46 NEW cov: 11805 ft: 14875 corp: 40/846b lim: 40 exec/s: 46 rss: 70Mb L: 18/40 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:14.105 [2024-12-02 15:22:59.753412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:000c000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.105 [2024-12-02 15:22:59.753441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.105 #47 NEW cov: 11805 ft: 14882 corp: 41/858b lim: 40 exec/s: 47 rss: 70Mb L: 12/40 MS: 1 ChangeBinInt- 00:07:14.105 [2024-12-02 15:22:59.793553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:01801800 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.105 [2024-12-02 15:22:59.793579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.105 #48 NEW cov: 11805 ft: 14959 corp: 42/871b lim: 40 exec/s: 24 rss: 70Mb L: 13/40 MS: 1 ChangeBit- 00:07:14.105 #48 DONE cov: 11805 ft: 14959 corp: 42/871b lim: 40 exec/s: 24 rss: 70Mb 00:07:14.105 ###### Recommended dictionary. ###### 00:07:14.105 "\001\000\000\000" # Uses: 1 00:07:14.105 ###### End of recommended dictionary. 
###### 00:07:14.105 Done 48 runs in 2 second(s) 00:07:14.366 15:22:59 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:07:14.366 15:22:59 -- ../common.sh@72 -- # (( i++ )) 00:07:14.366 15:22:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:14.366 15:22:59 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:07:14.366 15:22:59 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:07:14.366 15:22:59 -- nvmf/run.sh@24 -- # local timen=1 00:07:14.366 15:22:59 -- nvmf/run.sh@25 -- # local core=0x1 00:07:14.366 15:22:59 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:14.366 15:22:59 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:07:14.366 15:22:59 -- nvmf/run.sh@29 -- # printf %02d 14 00:07:14.366 15:22:59 -- nvmf/run.sh@29 -- # port=4414 00:07:14.366 15:22:59 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:14.366 15:22:59 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:07:14.366 15:22:59 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:14.366 15:22:59 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:07:14.366 [2024-12-02 15:22:59.984360] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:14.366 [2024-12-02 15:22:59.984424] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1601699 ] 00:07:14.366 EAL: No free 2048 kB hugepages reported on node 1 00:07:14.626 [2024-12-02 15:23:00.247213] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.626 [2024-12-02 15:23:00.337238] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:14.626 [2024-12-02 15:23:00.337359] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.626 [2024-12-02 15:23:00.395431] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:14.626 [2024-12-02 15:23:00.411753] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:07:14.626 INFO: Running with entropic power schedule (0xFF, 100). 00:07:14.626 INFO: Seed: 3409008559 00:07:14.626 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:14.626 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:14.626 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:14.626 INFO: A corpus is not provided, starting from an empty corpus 00:07:14.626 #2 INITED exec/s: 0 rss: 60Mb 00:07:14.626 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:14.626 This may also happen if the target rejected all inputs we tried so far 00:07:14.886 [2024-12-02 15:23:00.457927] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.886 [2024-12-02 15:23:00.457959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.886 [2024-12-02 15:23:00.458025] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.886 [2024-12-02 15:23:00.458042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.886 [2024-12-02 15:23:00.458106] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.886 [2024-12-02 15:23:00.458122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.886 [2024-12-02 15:23:00.458182] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.886 [2024-12-02 15:23:00.458198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.886 [2024-12-02 15:23:00.458261] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:14.886 [2024-12-02 15:23:00.458277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.148 NEW_FUNC[1/671]: 0x44e908 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:07:15.148 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:15.148 #3 NEW cov: 11569 ft: 11573 corp: 2/36b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:15.148 [2024-12-02 15:23:00.758236] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.148 [2024-12-02 15:23:00.758279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.148 NEW_FUNC[1/2]: 0x46fd38 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:15.148 NEW_FUNC[2/2]: 0x1133ce8 in nvmf_ctrlr_set_features_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1651 00:07:15.148 #4 NEW cov: 11725 ft: 12458 corp: 3/50b lim: 35 exec/s: 0 rss: 69Mb L: 14/35 MS: 1 InsertRepeatedBytes- 00:07:15.148 [2024-12-02 15:23:00.798307] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.148 [2024-12-02 15:23:00.798336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.148 [2024-12-02 15:23:00.798396] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET 
FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.148 [2024-12-02 15:23:00.798410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.148 [2024-12-02 15:23:00.798467] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.148 [2024-12-02 15:23:00.798480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.148 #5 NEW cov: 11731 ft: 13000 corp: 4/72b lim: 35 exec/s: 0 rss: 69Mb L: 22/35 MS: 1 CrossOver- 00:07:15.148 [2024-12-02 15:23:00.838430] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.148 [2024-12-02 15:23:00.838459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.148 [2024-12-02 15:23:00.838523] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.148 [2024-12-02 15:23:00.838537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.148 [2024-12-02 15:23:00.838595] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.148 [2024-12-02 15:23:00.838609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.149 #6 NEW cov: 11816 ft: 13273 corp: 5/99b lim: 35 exec/s: 0 rss: 69Mb L: 27/35 MS: 1 InsertRepeatedBytes- 00:07:15.149 [2024-12-02 15:23:00.888556] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.149 [2024-12-02 15:23:00.888583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.149 [2024-12-02 15:23:00.888658] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.149 [2024-12-02 15:23:00.888672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.149 [2024-12-02 15:23:00.888738] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.149 [2024-12-02 15:23:00.888751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.149 #7 NEW cov: 11816 ft: 13433 corp: 6/121b lim: 35 exec/s: 0 rss: 69Mb L: 22/35 MS: 1 ChangeBinInt- 00:07:15.149 [2024-12-02 15:23:00.928704] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.149 [2024-12-02 15:23:00.928737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.149 [2024-12-02 15:23:00.928799] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:15.149 [2024-12-02 15:23:00.928814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.149 [2024-12-02 15:23:00.928871] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.149 [2024-12-02 15:23:00.928887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.149 #8 NEW cov: 11816 ft: 13510 corp: 7/145b lim: 35 exec/s: 0 rss: 69Mb L: 24/35 MS: 1 EraseBytes- 00:07:15.409 [2024-12-02 15:23:00.968764] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.409 [2024-12-02 15:23:00.968790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.409 [2024-12-02 15:23:00.968850] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.409 [2024-12-02 15:23:00.968864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.409 [2024-12-02 15:23:00.968921] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.409 [2024-12-02 15:23:00.968935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.409 #9 NEW cov: 11816 ft: 13571 corp: 8/172b lim: 35 exec/s: 0 rss: 69Mb L: 27/35 MS: 1 ChangeBit- 00:07:15.409 #10 NEW cov: 11816 ft: 14155 corp: 9/180b lim: 35 exec/s: 0 rss: 69Mb L: 8/35 MS: 1 EraseBytes- 00:07:15.409 [2024-12-02 15:23:01.059217] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.409 [2024-12-02 15:23:01.059244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.409 [2024-12-02 15:23:01.059302] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.409 [2024-12-02 15:23:01.059316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.409 [2024-12-02 15:23:01.059390] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.409 [2024-12-02 15:23:01.059405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.409 [2024-12-02 15:23:01.059463] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.409 [2024-12-02 15:23:01.059477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.409 #11 NEW cov: 11816 ft: 14266 corp: 10/208b lim: 35 exec/s: 0 rss: 69Mb L: 28/35 MS: 1 InsertByte- 00:07:15.409 [2024-12-02 15:23:01.099071] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:15.409 [2024-12-02 15:23:01.099097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.409 #12 NEW cov: 11816 ft: 14300 corp: 11/222b lim: 35 exec/s: 0 rss: 69Mb L: 14/35 MS: 1 CrossOver- 00:07:15.409 [2024-12-02 15:23:01.139497] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.409 [2024-12-02 15:23:01.139522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.409 [2024-12-02 15:23:01.139583] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.409 [2024-12-02 15:23:01.139595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.409 [2024-12-02 15:23:01.139655] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.409 [2024-12-02 15:23:01.139670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.409 #13 NEW cov: 11816 ft: 14328 corp: 12/251b lim: 35 exec/s: 0 rss: 69Mb L: 29/35 MS: 1 CrossOver- 00:07:15.409 [2024-12-02 15:23:01.179386] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.409 [2024-12-02 15:23:01.179413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.409 [2024-12-02 15:23:01.179474] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.409 [2024-12-02 15:23:01.179488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.409 [2024-12-02 15:23:01.179548] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.409 [2024-12-02 15:23:01.179562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.409 #14 NEW cov: 11816 ft: 14337 corp: 13/278b lim: 35 exec/s: 0 rss: 69Mb L: 27/35 MS: 1 CrossOver- 00:07:15.409 [2024-12-02 15:23:01.219363] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.410 [2024-12-02 15:23:01.219394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.670 #15 NEW cov: 11816 ft: 14354 corp: 14/294b lim: 35 exec/s: 0 rss: 69Mb L: 16/35 MS: 1 CMP- DE: "\006\000"- 00:07:15.670 [2024-12-02 15:23:01.259774] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.670 [2024-12-02 15:23:01.259800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.670 [2024-12-02 15:23:01.259858] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 
cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.670 [2024-12-02 15:23:01.259872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.670 [2024-12-02 15:23:01.259932] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.670 [2024-12-02 15:23:01.259945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.670 [2024-12-02 15:23:01.260005] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.670 [2024-12-02 15:23:01.260020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.670 #16 NEW cov: 11816 ft: 14448 corp: 15/322b lim: 35 exec/s: 0 rss: 70Mb L: 28/35 MS: 1 InsertByte- 00:07:15.670 [2024-12-02 15:23:01.299915] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.670 [2024-12-02 15:23:01.299940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.670 [2024-12-02 15:23:01.299999] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.670 [2024-12-02 15:23:01.300012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.670 [2024-12-02 15:23:01.300071] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.670 [2024-12-02 15:23:01.300085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.670 [2024-12-02 15:23:01.300140] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.671 [2024-12-02 15:23:01.300154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.671 #17 NEW cov: 11816 ft: 14493 corp: 16/352b lim: 35 exec/s: 0 rss: 70Mb L: 30/35 MS: 1 PersAutoDict- DE: "\006\000"- 00:07:15.671 [2024-12-02 15:23:01.340066] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.671 [2024-12-02 15:23:01.340091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.671 [2024-12-02 15:23:01.340150] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.671 [2024-12-02 15:23:01.340164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.671 [2024-12-02 15:23:01.340237] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.671 [2024-12-02 15:23:01.340253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 
cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.671 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:15.671 #18 NEW cov: 11839 ft: 14599 corp: 17/383b lim: 35 exec/s: 0 rss: 70Mb L: 31/35 MS: 1 PersAutoDict- DE: "\006\000"- 00:07:15.671 [2024-12-02 15:23:01.390034] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.671 [2024-12-02 15:23:01.390061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.671 [2024-12-02 15:23:01.390123] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.671 [2024-12-02 15:23:01.390137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.671 [2024-12-02 15:23:01.390196] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.671 [2024-12-02 15:23:01.390208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.671 #19 NEW cov: 11839 ft: 14636 corp: 18/405b lim: 35 exec/s: 0 rss: 70Mb L: 22/35 MS: 1 ChangeBit- 00:07:15.671 [2024-12-02 15:23:01.430109] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.671 [2024-12-02 15:23:01.430134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.671 [2024-12-02 15:23:01.430196] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.671 [2024-12-02 15:23:01.430209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.671 [2024-12-02 15:23:01.430270] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.671 [2024-12-02 15:23:01.430282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.671 #20 NEW cov: 11839 ft: 14696 corp: 19/427b lim: 35 exec/s: 20 rss: 70Mb L: 22/35 MS: 1 PersAutoDict- DE: "\006\000"- 00:07:15.671 [2024-12-02 15:23:01.470109] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.671 [2024-12-02 15:23:01.470135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.930 #21 NEW cov: 11839 ft: 14759 corp: 20/441b lim: 35 exec/s: 21 rss: 70Mb L: 14/35 MS: 1 PersAutoDict- DE: "\006\000"- 00:07:15.931 [2024-12-02 15:23:01.510508] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.510532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.931 [2024-12-02 15:23:01.510589] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES 
RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.510602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.931 [2024-12-02 15:23:01.510660] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.510673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.931 [2024-12-02 15:23:01.510753] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.510768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.931 #22 NEW cov: 11839 ft: 14773 corp: 21/469b lim: 35 exec/s: 22 rss: 70Mb L: 28/35 MS: 1 ChangeByte- 00:07:15.931 [2024-12-02 15:23:01.550639] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.550664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.931 [2024-12-02 15:23:01.550726] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.550744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.931 [2024-12-02 15:23:01.550778] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.550792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.931 [2024-12-02 15:23:01.550852] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.550867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.931 #23 NEW cov: 11839 ft: 14785 corp: 22/497b lim: 35 exec/s: 23 rss: 70Mb L: 28/35 MS: 1 CopyPart- 00:07:15.931 [2024-12-02 15:23:01.590769] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.590795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.931 [2024-12-02 15:23:01.590871] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.590885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.931 [2024-12-02 15:23:01.590919] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.590932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:07:15.931 #24 NEW cov: 11839 ft: 14798 corp: 23/528b lim: 35 exec/s: 24 rss: 70Mb L: 31/35 MS: 1 CopyPart- 00:07:15.931 [2024-12-02 15:23:01.630728] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.630756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.931 [2024-12-02 15:23:01.630817] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.630833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.931 #25 NEW cov: 11839 ft: 14917 corp: 24/553b lim: 35 exec/s: 25 rss: 70Mb L: 25/35 MS: 1 InsertRepeatedBytes- 00:07:15.931 [2024-12-02 15:23:01.670986] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.671011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.931 [2024-12-02 15:23:01.671069] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.671083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.931 [2024-12-02 15:23:01.671138] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.671154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.931 #26 NEW cov: 11839 ft: 14931 corp: 25/585b lim: 35 exec/s: 26 rss: 70Mb L: 32/35 MS: 1 InsertByte- 00:07:15.931 [2024-12-02 15:23:01.711151] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.711176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.931 [2024-12-02 15:23:01.711237] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.711250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.931 [2024-12-02 15:23:01.711308] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:15.931 [2024-12-02 15:23:01.711321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.931 #27 NEW cov: 11839 ft: 14937 corp: 26/619b lim: 35 exec/s: 27 rss: 70Mb L: 34/35 MS: 1 CrossOver- 00:07:16.190 [2024-12-02 15:23:01.750726] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.750755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.191 #28 NEW cov: 11839 ft: 14948 corp: 27/630b lim: 35 exec/s: 28 rss: 70Mb L: 11/35 MS: 1 EraseBytes- 00:07:16.191 [2024-12-02 15:23:01.791352] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.791378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.191 [2024-12-02 15:23:01.791441] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.791454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.191 [2024-12-02 15:23:01.791513] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.791526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.191 [2024-12-02 15:23:01.791586] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.791602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.191 #29 NEW cov: 11839 ft: 15028 corp: 28/658b lim: 35 exec/s: 29 rss: 70Mb L: 28/35 MS: 1 CopyPart- 00:07:16.191 [2024-12-02 15:23:01.831167] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.831192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.191 #30 NEW cov: 11839 ft: 15092 corp: 29/674b lim: 35 exec/s: 30 rss: 70Mb L: 16/35 MS: 1 CMP- DE: "\015\000\000\000"- 00:07:16.191 [2024-12-02 15:23:01.871620] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.871647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.191 [2024-12-02 15:23:01.871708] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.871725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.191 [2024-12-02 15:23:01.871791] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.871805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.191 #31 NEW cov: 11839 ft: 15110 corp: 30/706b lim: 35 exec/s: 31 rss: 70Mb L: 32/35 MS: 1 ChangeBinInt- 00:07:16.191 [2024-12-02 15:23:01.911339] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.911364] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.191 [2024-12-02 15:23:01.911423] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.911437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.191 [2024-12-02 15:23:01.951463] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.951488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.191 [2024-12-02 15:23:01.951546] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.951562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.191 #33 NEW cov: 11839 ft: 15135 corp: 31/721b lim: 35 exec/s: 33 rss: 70Mb L: 15/35 MS: 2 InsertRepeatedBytes-ChangeBinInt- 00:07:16.191 [2024-12-02 15:23:01.991896] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.991923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.191 [2024-12-02 15:23:01.991997] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.992011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.191 [2024-12-02 15:23:01.992070] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.992083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.191 [2024-12-02 15:23:01.992141] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000074 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.191 [2024-12-02 15:23:01.992155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.450 #34 NEW cov: 11839 ft: 15138 corp: 32/754b lim: 35 exec/s: 34 rss: 70Mb L: 33/35 MS: 1 InsertRepeatedBytes- 00:07:16.450 [2024-12-02 15:23:02.031921] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-12-02 15:23:02.031948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.450 [2024-12-02 15:23:02.032007] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-12-02 15:23:02.032022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.450 #35 NEW cov: 11839 ft: 15142 corp: 33/779b lim: 35 exec/s: 35 rss: 
70Mb L: 25/35 MS: 1 ChangeByte- 00:07:16.450 [2024-12-02 15:23:02.071880] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-12-02 15:23:02.071905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.450 #36 NEW cov: 11839 ft: 15144 corp: 34/793b lim: 35 exec/s: 36 rss: 70Mb L: 14/35 MS: 1 CopyPart- 00:07:16.450 [2024-12-02 15:23:02.112281] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-12-02 15:23:02.112307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.450 [2024-12-02 15:23:02.112364] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-12-02 15:23:02.112378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.450 [2024-12-02 15:23:02.112436] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-12-02 15:23:02.112449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.450 [2024-12-02 15:23:02.112509] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-12-02 15:23:02.112522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.450 #37 NEW cov: 11839 ft: 15161 corp: 35/824b lim: 35 exec/s: 37 rss: 70Mb L: 31/35 MS: 1 InsertByte- 00:07:16.450 [2024-12-02 15:23:02.152098] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-12-02 15:23:02.152124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.450 #38 NEW cov: 11839 ft: 15172 corp: 36/840b lim: 35 exec/s: 38 rss: 70Mb L: 16/35 MS: 1 CopyPart- 00:07:16.450 [2024-12-02 15:23:02.192487] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-12-02 15:23:02.192515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.450 [2024-12-02 15:23:02.192594] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-12-02 15:23:02.192608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.450 [2024-12-02 15:23:02.192669] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-12-02 15:23:02.192682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.450 [2024-12-02 15:23:02.192743] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000074 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-12-02 15:23:02.192757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.450 #39 NEW cov: 11839 ft: 15183 corp: 37/873b lim: 35 exec/s: 39 rss: 70Mb L: 33/35 MS: 1 ChangeBit- 00:07:16.450 [2024-12-02 15:23:02.232359] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.450 [2024-12-02 15:23:02.232384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.450 #40 NEW cov: 11839 ft: 15192 corp: 38/888b lim: 35 exec/s: 40 rss: 70Mb L: 15/35 MS: 1 CopyPart- 00:07:16.709 [2024-12-02 15:23:02.272715] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-12-02 15:23:02.272750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.709 [2024-12-02 15:23:02.272811] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-12-02 15:23:02.272825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.709 [2024-12-02 15:23:02.272885] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-12-02 15:23:02.272900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.709 [2024-12-02 15:23:02.272958] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:000000b5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-12-02 15:23:02.272972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.709 #41 NEW cov: 11839 ft: 15199 corp: 39/916b lim: 35 exec/s: 41 rss: 70Mb L: 28/35 MS: 1 CMP- DE: "*\257\246\374\265\006\000\000"- 00:07:16.709 [2024-12-02 15:23:02.312500] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-12-02 15:23:02.312528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.709 #42 NEW cov: 11839 ft: 15232 corp: 40/932b lim: 35 exec/s: 42 rss: 70Mb L: 16/35 MS: 1 EraseBytes- 00:07:16.709 [2024-12-02 15:23:02.352902] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-12-02 15:23:02.352927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:16.709 [2024-12-02 15:23:02.352990] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-12-02 15:23:02.353003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.709 
[2024-12-02 15:23:02.353064] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-12-02 15:23:02.353077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.709 [2024-12-02 15:23:02.353137] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:80000087 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-12-02 15:23:02.353153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.709 #43 NEW cov: 11839 ft: 15255 corp: 41/960b lim: 35 exec/s: 43 rss: 70Mb L: 28/35 MS: 1 CopyPart- 00:07:16.709 [2024-12-02 15:23:02.393084] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-12-02 15:23:02.393110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.709 [2024-12-02 15:23:02.393171] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-12-02 15:23:02.393185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:16.709 [2024-12-02 15:23:02.393246] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-12-02 15:23:02.393258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:16.709 #44 NEW cov: 11839 ft: 15271 corp: 42/994b lim: 35 exec/s: 44 rss: 70Mb L: 34/35 MS: 1 CrossOver- 00:07:16.709 [2024-12-02 15:23:02.432844] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:16.709 [2024-12-02 15:23:02.432869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:16.709 #45 NEW cov: 11839 ft: 15275 corp: 43/1010b lim: 35 exec/s: 22 rss: 70Mb L: 16/35 MS: 1 ChangeByte- 00:07:16.709 #45 DONE cov: 11839 ft: 15275 corp: 43/1010b lim: 35 exec/s: 22 rss: 70Mb 00:07:16.709 ###### Recommended dictionary. ###### 00:07:16.709 "\006\000" # Uses: 4 00:07:16.709 "\015\000\000\000" # Uses: 0 00:07:16.709 "*\257\246\374\265\006\000\000" # Uses: 0 00:07:16.709 ###### End of recommended dictionary. 
###### 00:07:16.709 Done 45 runs in 2 second(s) 00:07:16.969 15:23:02 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:07:16.969 15:23:02 -- ../common.sh@72 -- # (( i++ )) 00:07:16.969 15:23:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:16.969 15:23:02 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:07:16.969 15:23:02 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:07:16.969 15:23:02 -- nvmf/run.sh@24 -- # local timen=1 00:07:16.969 15:23:02 -- nvmf/run.sh@25 -- # local core=0x1 00:07:16.969 15:23:02 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:16.969 15:23:02 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:07:16.969 15:23:02 -- nvmf/run.sh@29 -- # printf %02d 15 00:07:16.969 15:23:02 -- nvmf/run.sh@29 -- # port=4415 00:07:16.969 15:23:02 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:16.969 15:23:02 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:07:16.969 15:23:02 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:16.969 15:23:02 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:07:16.969 [2024-12-02 15:23:02.622171] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:16.969 [2024-12-02 15:23:02.622240] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1602237 ] 00:07:16.969 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.228 [2024-12-02 15:23:02.798137] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.228 [2024-12-02 15:23:02.860220] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:17.228 [2024-12-02 15:23:02.860351] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.228 [2024-12-02 15:23:02.918199] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:17.228 [2024-12-02 15:23:02.934533] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:07:17.228 INFO: Running with entropic power schedule (0xFF, 100). 00:07:17.228 INFO: Seed: 1634031473 00:07:17.228 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:17.228 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:17.228 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:17.228 INFO: A corpus is not provided, starting from an empty corpus 00:07:17.228 #2 INITED exec/s: 0 rss: 60Mb 00:07:17.228 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:17.228 This may also happen if the target rejected all inputs we tried so far 00:07:17.487 NEW_FUNC[1/659]: 0x44fe48 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:07:17.487 NEW_FUNC[2/659]: 0x46b4b8 in feat_temperature_threshold /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:295 00:07:17.487 #12 NEW cov: 11485 ft: 11486 corp: 2/12b lim: 35 exec/s: 0 rss: 68Mb L: 11/11 MS: 5 InsertByte-ChangeBit-ShuffleBytes-InsertByte-CMP- DE: "\004\000\000\000\000\000\000\000"- 00:07:17.747 [2024-12-02 15:23:03.325322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.747 [2024-12-02 15:23:03.325365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:17.747 [2024-12-02 15:23:03.325511] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.747 [2024-12-02 15:23:03.325532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.747 [2024-12-02 15:23:03.325660] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.747 [2024-12-02 15:23:03.325678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.747 [2024-12-02 15:23:03.325788] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.747 [2024-12-02 15:23:03.325809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.748 NEW_FUNC[1/14]: 0x169c068 in spdk_nvme_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:263 00:07:17.748 NEW_FUNC[2/14]: 0x169c2a8 in nvme_admin_qpair_print_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:202 00:07:17.748 #16 NEW cov: 11728 ft: 12745 corp: 3/43b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 4 ShuffleBytes-InsertByte-ChangeBinInt-InsertRepeatedBytes- 00:07:17.748 [2024-12-02 15:23:03.375299] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.748 [2024-12-02 15:23:03.375328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.748 [2024-12-02 15:23:03.375467] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.748 [2024-12-02 15:23:03.375484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.748 #17 NEW cov: 11734 ft: 13142 corp: 4/68b lim: 35 exec/s: 0 rss: 68Mb L: 25/31 MS: 1 InsertRepeatedBytes- 00:07:17.748 [2024-12-02 15:23:03.425630] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.748 [2024-12-02 15:23:03.425659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.748 [2024-12-02 15:23:03.425799] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.748 [2024-12-02 15:23:03.425819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.748 [2024-12-02 15:23:03.425950] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.748 [2024-12-02 15:23:03.425967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:17.748 NEW_FUNC[1/1]: 0x46a548 in feat_power_management /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:282 00:07:17.748 #19 NEW cov: 11842 ft: 13503 corp: 5/96b lim: 35 exec/s: 0 rss: 68Mb L: 28/31 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:17.748 [2024-12-02 15:23:03.465604] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.748 [2024-12-02 15:23:03.465635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.748 [2024-12-02 15:23:03.465786] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.748 [2024-12-02 15:23:03.465803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.748 #20 NEW cov: 11842 ft: 13639 corp: 6/121b lim: 35 exec/s: 0 rss: 68Mb L: 25/31 MS: 1 ChangeBinInt- 00:07:17.748 #21 NEW cov: 11842 ft: 13771 corp: 7/132b lim: 35 exec/s: 0 rss: 68Mb L: 11/31 MS: 1 ChangeBinInt- 00:07:17.748 [2024-12-02 15:23:03.556028] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.748 [2024-12-02 15:23:03.556057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:17.748 [2024-12-02 15:23:03.556189] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.748 [2024-12-02 15:23:03.556208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:17.748 [2024-12-02 15:23:03.556339] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.748 [2024-12-02 15:23:03.556356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.008 #22 NEW cov: 11842 ft: 13836 corp: 8/160b lim: 35 exec/s: 0 rss: 68Mb L: 28/31 MS: 1 ChangeBit- 00:07:18.008 [2024-12-02 15:23:03.606158] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.606188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.008 [2024-12-02 15:23:03.606315] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.606334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.008 [2024-12-02 15:23:03.606463] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.606481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.008 [2024-12-02 15:23:03.606614] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.606642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.008 #23 NEW cov: 11842 ft: 13893 corp: 9/191b lim: 35 exec/s: 0 rss: 68Mb L: 31/31 MS: 1 ChangeByte- 00:07:18.008 [2024-12-02 15:23:03.656660] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.656689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.008 [2024-12-02 15:23:03.656821] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.656840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.008 [2024-12-02 15:23:03.656972] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.656990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.008 [2024-12-02 15:23:03.657114] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.657133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:18.008 #24 NEW cov: 11842 ft: 13957 corp: 10/226b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 CrossOver- 00:07:18.008 [2024-12-02 15:23:03.706435] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.706462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.008 [2024-12-02 15:23:03.706577] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.706594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.008 [2024-12-02 15:23:03.706723] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.706745] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.008 [2024-12-02 15:23:03.706879] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.706895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.008 #25 NEW cov: 11842 ft: 14058 corp: 11/257b lim: 35 exec/s: 0 rss: 68Mb L: 31/35 MS: 1 CopyPart- 00:07:18.008 [2024-12-02 15:23:03.746662] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.746689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.008 [2024-12-02 15:23:03.746820] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.746835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.008 [2024-12-02 15:23:03.746954] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.746971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.008 #26 NEW cov: 11842 ft: 14126 corp: 12/291b lim: 35 exec/s: 0 rss: 68Mb L: 34/35 MS: 1 CrossOver- 00:07:18.008 [2024-12-02 15:23:03.786171] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.786198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.008 [2024-12-02 15:23:03.786318] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.008 [2024-12-02 15:23:03.786335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.008 #32 NEW cov: 11842 ft: 14243 corp: 13/309b lim: 35 exec/s: 0 rss: 68Mb L: 18/35 MS: 1 CrossOver- 00:07:18.268 [2024-12-02 15:23:03.836614] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:03.836642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.268 [2024-12-02 15:23:03.836815] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:03.836836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.268 [2024-12-02 15:23:03.836973] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:03.836990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:07:18.268 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:18.268 #38 NEW cov: 11865 ft: 14348 corp: 14/335b lim: 35 exec/s: 0 rss: 69Mb L: 26/35 MS: 1 PersAutoDict- DE: "\004\000\000\000\000\000\000\000"- 00:07:18.268 [2024-12-02 15:23:03.887129] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:03.887157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.268 [2024-12-02 15:23:03.887293] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:03.887313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.268 [2024-12-02 15:23:03.887436] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:03.887454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.268 #39 NEW cov: 11865 ft: 14350 corp: 15/363b lim: 35 exec/s: 0 rss: 69Mb L: 28/35 MS: 1 ChangeBit- 00:07:18.268 [2024-12-02 15:23:03.937194] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:03.937223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.268 [2024-12-02 15:23:03.937360] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:03.937378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.268 [2024-12-02 15:23:03.937511] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:03.937529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.268 #40 NEW cov: 11865 ft: 14431 corp: 16/391b lim: 35 exec/s: 0 rss: 69Mb L: 28/35 MS: 1 CopyPart- 00:07:18.268 [2024-12-02 15:23:03.977235] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005af SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:03.977262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.268 [2024-12-02 15:23:03.977399] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005af SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:03.977417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.268 [2024-12-02 15:23:03.977541] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005af SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:03.977558] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.268 [2024-12-02 15:23:03.977688] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005af SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:03.977705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.268 #41 NEW cov: 11865 ft: 14467 corp: 17/419b lim: 35 exec/s: 41 rss: 69Mb L: 28/35 MS: 1 InsertRepeatedBytes- 00:07:18.268 [2024-12-02 15:23:04.017395] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:04.017421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.268 [2024-12-02 15:23:04.017552] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:04.017569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.268 [2024-12-02 15:23:04.017696] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:04.017713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.268 [2024-12-02 15:23:04.017847] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:04.017863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.268 #42 NEW cov: 11865 ft: 14480 corp: 18/453b lim: 35 exec/s: 42 rss: 69Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:07:18.268 [2024-12-02 15:23:04.057588] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:04.057615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.268 [2024-12-02 15:23:04.057749] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:04.057783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.268 [2024-12-02 15:23:04.057907] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.268 [2024-12-02 15:23:04.057925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.268 #43 NEW cov: 11865 ft: 14498 corp: 19/484b lim: 35 exec/s: 43 rss: 69Mb L: 31/35 MS: 1 InsertRepeatedBytes- 00:07:18.528 [2024-12-02 15:23:04.097613] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005af SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.528 [2024-12-02 15:23:04.097639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:18.528 [2024-12-02 15:23:04.097781] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005af SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.528 [2024-12-02 15:23:04.097800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.528 [2024-12-02 15:23:04.097937] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000002af SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.528 [2024-12-02 15:23:04.097956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.528 [2024-12-02 15:23:04.098073] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000250 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.528 [2024-12-02 15:23:04.098091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.528 #44 NEW cov: 11865 ft: 14508 corp: 20/512b lim: 35 exec/s: 44 rss: 69Mb L: 28/35 MS: 1 ChangeBinInt- 00:07:18.528 NEW_FUNC[1/1]: 0x46fd38 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:18.528 #47 NEW cov: 11879 ft: 14532 corp: 21/521b lim: 35 exec/s: 47 rss: 69Mb L: 9/35 MS: 3 CopyPart-EraseBytes-CMP- DE: "\003\000\000\000\000\000\000\000"- 00:07:18.528 [2024-12-02 15:23:04.187753] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.528 [2024-12-02 15:23:04.187794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.528 [2024-12-02 15:23:04.187814] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.528 [2024-12-02 15:23:04.187833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.528 #48 NEW cov: 11879 ft: 14544 corp: 22/546b lim: 35 exec/s: 48 rss: 69Mb L: 25/35 MS: 1 InsertRepeatedBytes- 00:07:18.528 [2024-12-02 15:23:04.237919] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.528 [2024-12-02 15:23:04.237946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.528 [2024-12-02 15:23:04.238074] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.528 [2024-12-02 15:23:04.238092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.528 #49 NEW cov: 11879 ft: 14579 corp: 23/571b lim: 35 exec/s: 49 rss: 69Mb L: 25/35 MS: 1 ChangeBit- 00:07:18.528 [2024-12-02 15:23:04.278195] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.528 [2024-12-02 15:23:04.278220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.528 [2024-12-02 15:23:04.278355] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.528 [2024-12-02 15:23:04.278370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.528 [2024-12-02 15:23:04.278499] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.528 [2024-12-02 15:23:04.278515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.528 [2024-12-02 15:23:04.278644] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.528 [2024-12-02 15:23:04.278660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.528 #50 NEW cov: 11879 ft: 14620 corp: 24/605b lim: 35 exec/s: 50 rss: 69Mb L: 34/35 MS: 1 ChangeBinInt- 00:07:18.528 [2024-12-02 15:23:04.318245] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.528 [2024-12-02 15:23:04.318271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.528 [2024-12-02 15:23:04.318406] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.528 [2024-12-02 15:23:04.318422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.528 [2024-12-02 15:23:04.318541] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST MEM BUFFER cid:6 cdw10:0000000d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.528 [2024-12-02 15:23:04.318559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.528 [2024-12-02 15:23:04.318681] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.528 [2024-12-02 15:23:04.318699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.528 #51 NEW cov: 11879 ft: 14649 corp: 25/639b lim: 35 exec/s: 51 rss: 69Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:07:18.788 [2024-12-02 15:23:04.358332] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005af SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.358360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.788 [2024-12-02 15:23:04.358500] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005af SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.358516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.788 [2024-12-02 15:23:04.358612] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000002af SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.358631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.788 [2024-12-02 15:23:04.358753] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000250 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.358771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.788 #52 NEW cov: 11879 ft: 14664 corp: 26/667b lim: 35 exec/s: 52 rss: 69Mb L: 28/35 MS: 1 CrossOver- 00:07:18.788 [2024-12-02 15:23:04.408605] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.408632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.788 [2024-12-02 15:23:04.408758] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.408786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.788 [2024-12-02 15:23:04.408909] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000657 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.408924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.788 #53 NEW cov: 11879 ft: 14674 corp: 27/701b lim: 35 exec/s: 53 rss: 69Mb L: 34/35 MS: 1 ChangeBit- 00:07:18.788 [2024-12-02 15:23:04.448795] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.448820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.788 [2024-12-02 15:23:04.448937] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.448967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.788 [2024-12-02 15:23:04.449094] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.449109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.788 #54 NEW cov: 11879 ft: 14690 corp: 28/729b lim: 35 exec/s: 54 rss: 70Mb L: 28/35 MS: 1 CMP- DE: "\003\000\000\000\000\000\000\000"- 00:07:18.788 [2024-12-02 15:23:04.488907] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.488938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.788 [2024-12-02 15:23:04.489073] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.489091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:18.788 [2024-12-02 15:23:04.489218] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.489234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.788 #55 NEW cov: 11879 ft: 14695 corp: 29/757b lim: 35 exec/s: 55 rss: 70Mb L: 28/35 MS: 1 PersAutoDict- DE: "\004\000\000\000\000\000\000\000"- 00:07:18.788 #56 NEW cov: 11879 ft: 14704 corp: 30/766b lim: 35 exec/s: 56 rss: 70Mb L: 9/35 MS: 1 ChangeBinInt- 00:07:18.788 [2024-12-02 15:23:04.569002] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.569029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:18.788 [2024-12-02 15:23:04.569163] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.569180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:18.788 [2024-12-02 15:23:04.569307] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST MEM BUFFER cid:6 cdw10:0000000d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.569324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:18.788 [2024-12-02 15:23:04.569449] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.788 [2024-12-02 15:23:04.569465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:18.788 #57 NEW cov: 11879 ft: 14709 corp: 31/800b lim: 35 exec/s: 57 rss: 70Mb L: 34/35 MS: 1 ChangeByte- 00:07:19.048 [2024-12-02 15:23:04.609235] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.048 [2024-12-02 15:23:04.609263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.048 [2024-12-02 15:23:04.609393] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.048 [2024-12-02 15:23:04.609410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.048 [2024-12-02 15:23:04.609541] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.048 [2024-12-02 15:23:04.609557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.048 #58 NEW cov: 11879 ft: 14726 corp: 32/828b lim: 35 exec/s: 58 rss: 70Mb L: 28/35 MS: 1 ChangeBit- 00:07:19.048 [2024-12-02 15:23:04.649276] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.048 [2024-12-02 15:23:04.649303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.048 [2024-12-02 15:23:04.649440] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.048 [2024-12-02 15:23:04.649461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.048 [2024-12-02 15:23:04.649588] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES HOST MEM BUFFER cid:6 cdw10:0000000d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.048 [2024-12-02 15:23:04.649605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.048 [2024-12-02 15:23:04.649737] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.048 [2024-12-02 15:23:04.649756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.048 #59 NEW cov: 11879 ft: 14739 corp: 33/862b lim: 35 exec/s: 59 rss: 70Mb L: 34/35 MS: 1 ChangeByte- 00:07:19.048 [2024-12-02 15:23:04.689459] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.048 [2024-12-02 15:23:04.689488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.048 [2024-12-02 15:23:04.689629] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.048 [2024-12-02 15:23:04.689648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.048 [2024-12-02 15:23:04.689783] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.048 [2024-12-02 15:23:04.689800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.048 #60 NEW cov: 11879 ft: 14777 corp: 34/896b lim: 35 exec/s: 60 rss: 70Mb L: 34/35 MS: 1 CopyPart- 00:07:19.048 [2024-12-02 15:23:04.739205] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.048 [2024-12-02 15:23:04.739235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.048 #61 NEW cov: 11879 ft: 14955 corp: 35/913b lim: 35 exec/s: 61 rss: 70Mb L: 17/35 MS: 1 InsertRepeatedBytes- 00:07:19.048 [2024-12-02 15:23:04.779470] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.048 [2024-12-02 15:23:04.779497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.048 [2024-12-02 15:23:04.779631] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006d8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.048 [2024-12-02 15:23:04.779650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.048 #62 
NEW cov: 11879 ft: 14969 corp: 36/938b lim: 35 exec/s: 62 rss: 70Mb L: 25/35 MS: 1 ChangeBit- 00:07:19.048 [2024-12-02 15:23:04.829593] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.048 [2024-12-02 15:23:04.829620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.048 [2024-12-02 15:23:04.829768] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.048 [2024-12-02 15:23:04.829786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.048 [2024-12-02 15:23:04.829925] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.048 [2024-12-02 15:23:04.829942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.048 #63 NEW cov: 11879 ft: 15012 corp: 37/963b lim: 35 exec/s: 63 rss: 70Mb L: 25/35 MS: 1 EraseBytes- 00:07:19.308 [2024-12-02 15:23:04.879267] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.308 [2024-12-02 15:23:04.879295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:19.308 #64 NEW cov: 11879 ft: 15054 corp: 38/972b lim: 35 exec/s: 64 rss: 70Mb L: 9/35 MS: 1 EraseBytes- 00:07:19.308 [2024-12-02 15:23:04.920442] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.308 [2024-12-02 15:23:04.920469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:19.308 [2024-12-02 15:23:04.920609] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.308 [2024-12-02 15:23:04.920626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.308 [2024-12-02 15:23:04.920747] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000657 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.308 [2024-12-02 15:23:04.920766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.308 [2024-12-02 15:23:04.920899] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000057 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.308 [2024-12-02 15:23:04.920918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.308 #65 NEW cov: 11879 ft: 15075 corp: 39/1007b lim: 35 exec/s: 65 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:07:19.308 [2024-12-02 15:23:04.960427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.308 [2024-12-02 15:23:04.960453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:19.308 [2024-12-02 15:23:04.960588] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.308 [2024-12-02 15:23:04.960607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:19.308 [2024-12-02 15:23:04.960736] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000257 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.309 [2024-12-02 15:23:04.960753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:19.309 [2024-12-02 15:23:04.960893] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000000e5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:19.309 [2024-12-02 15:23:04.960910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:19.309 #66 NEW cov: 11879 ft: 15082 corp: 40/1042b lim: 35 exec/s: 33 rss: 70Mb L: 35/35 MS: 1 InsertByte- 00:07:19.309 #66 DONE cov: 11879 ft: 15082 corp: 40/1042b lim: 35 exec/s: 33 rss: 70Mb 00:07:19.309 ###### Recommended dictionary. ###### 00:07:19.309 "\004\000\000\000\000\000\000\000" # Uses: 2 00:07:19.309 "\003\000\000\000\000\000\000\000" # Uses: 0 00:07:19.309 ###### End of recommended dictionary. ###### 00:07:19.309 Done 66 runs in 2 second(s) 00:07:19.309 15:23:05 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf 00:07:19.309 15:23:05 -- ../common.sh@72 -- # (( i++ )) 00:07:19.309 15:23:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:19.309 15:23:05 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:07:19.309 15:23:05 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:07:19.309 15:23:05 -- nvmf/run.sh@24 -- # local timen=1 00:07:19.309 15:23:05 -- nvmf/run.sh@25 -- # local core=0x1 00:07:19.309 15:23:05 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:19.309 15:23:05 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:07:19.309 15:23:05 -- nvmf/run.sh@29 -- # printf %02d 16 00:07:19.309 15:23:05 -- nvmf/run.sh@29 -- # port=4416 00:07:19.309 15:23:05 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:19.309 15:23:05 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:07:19.309 15:23:05 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:19.309 15:23:05 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock 00:07:19.567 [2024-12-02 15:23:05.148833] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:19.567 [2024-12-02 15:23:05.148916] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1602557 ]
00:07:19.567 EAL: No free 2048 kB hugepages reported on node 1
00:07:19.826 [2024-12-02 15:23:05.405622] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:19.826 [2024-12-02 15:23:05.489670] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:19.826 [2024-12-02 15:23:05.489819] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:19.826 [2024-12-02 15:23:05.547597] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:19.826 [2024-12-02 15:23:05.563917] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 ***
00:07:19.826 INFO: Running with entropic power schedule (0xFF, 100).
00:07:19.826 INFO: Seed: 4264028133
00:07:19.826 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:19.826 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:19.826 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:07:19.826 INFO: A corpus is not provided, starting from an empty corpus
00:07:19.826 #2 INITED exec/s: 0 rss: 60Mb
00:07:19.826 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:19.826 This may also happen if the target rejected all inputs we tried so far
00:07:19.826 [2024-12-02 15:23:05.608893] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:114362111370461184 len:9741 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:19.826 [2024-12-02 15:23:05.608923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.085 NEW_FUNC[1/671]: 0x451308 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519
00:07:20.085 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:20.085 #12 NEW cov: 11663 ft: 11663 corp: 2/27b lim: 105 exec/s: 0 rss: 68Mb L: 26/26 MS: 5 CopyPart-EraseBytes-InsertRepeatedBytes-ChangeBinInt-CMP- DE: "\001\226K\274\0312&\014"-
00:07:20.345 [2024-12-02 15:23:05.931033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.345 [2024-12-02 15:23:05.931094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.345 [2024-12-02 15:23:05.931242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.345 [2024-12-02 15:23:05.931274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:20.345 [2024-12-02 15:23:05.931416] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.345 [2024-12-02 15:23:05.931454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:07:20.345 [2024-12-02 15:23:05.931598] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.345 [2024-12-02 15:23:05.931630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:07:20.345 #13 NEW cov: 11776 ft: 13068 corp: 3/118b lim: 105 exec/s: 0 rss: 68Mb L: 91/91 MS: 1 InsertRepeatedBytes-
00:07:20.345 [2024-12-02 15:23:05.980030] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:114362111370461184 len:9741 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.345 [2024-12-02 15:23:05.980057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.345 #14 NEW cov: 11782 ft: 13299 corp: 4/144b lim: 105 exec/s: 0 rss: 68Mb L: 26/91 MS: 1 ChangeBinInt-
00:07:20.345 [2024-12-02 15:23:06.020190] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:10272304543006887566 len:36495 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.345 [2024-12-02 15:23:06.020216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.345 #15 NEW cov: 11867 ft: 13543 corp: 5/179b lim: 105 exec/s: 0 rss: 68Mb L: 35/91 MS: 1 InsertRepeatedBytes-
00:07:20.345 [2024-12-02 15:23:06.070845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:114362111370461184 len:9741 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.345 [2024-12-02 15:23:06.070871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.345 #16 NEW cov: 11867 ft: 13648 corp: 6/205b lim: 105 exec/s: 0 rss: 68Mb L: 26/91 MS: 1 PersAutoDict- DE: "\001\226K\274\0312&\014"-
00:07:20.345 [2024-12-02 15:23:06.120925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:114362111370461184 len:9741 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.345 [2024-12-02 15:23:06.120951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.345 #22 NEW cov: 11867 ft: 13715 corp: 7/231b lim: 105 exec/s: 0 rss: 68Mb L: 26/91 MS: 1 CrossOver-
00:07:20.345 [2024-12-02 15:23:06.160982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:114362111370461184 len:9741 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.345 [2024-12-02 15:23:06.161011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.605 #23 NEW cov: 11867 ft: 13784 corp: 8/257b lim: 105 exec/s: 0 rss: 68Mb L: 26/91 MS: 1 ChangeBinInt-
00:07:20.605 [2024-12-02 15:23:06.211313] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12153149034112526504 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.605 [2024-12-02 15:23:06.211339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.605 [2024-12-02 15:23:06.211464] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.605 [2024-12-02 15:23:06.211496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:20.605 #26 NEW cov: 11867 ft: 14101 corp: 9/306b lim: 105 exec/s: 0 rss: 68Mb L: 49/91 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes-
00:07:20.605 [2024-12-02 15:23:06.250912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:114362111370461184 len:55796 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.605 [2024-12-02 15:23:06.250943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.605 #27 NEW cov: 11867 ft: 14159 corp: 10/332b lim: 105 exec/s: 0 rss: 68Mb L: 26/91 MS: 1 ChangeBinInt-
00:07:20.605 [2024-12-02 15:23:06.291635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12153149034112526504 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.605 [2024-12-02 15:23:06.291663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.605 [2024-12-02 15:23:06.291816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12154274936703723688 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.605 [2024-12-02 15:23:06.291839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:20.605 #28 NEW cov: 11867 ft: 14198 corp: 11/381b lim: 105 exec/s: 0 rss: 68Mb L: 49/91 MS: 1 ChangeBit-
00:07:20.605 [2024-12-02 15:23:06.331376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:114362111370461184 len:9741 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.605 [2024-12-02 15:23:06.331402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.605 #29 NEW cov: 11867 ft: 14218 corp: 12/407b lim: 105 exec/s: 0 rss: 68Mb L: 26/91 MS: 1 ChangeBit-
00:07:20.605 [2024-12-02 15:23:06.381670] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:10272304541882814094 len:36495 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.605 [2024-12-02 15:23:06.381701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.605 #32 NEW cov: 11867 ft: 14250 corp: 13/442b lim: 105 exec/s: 0 rss: 68Mb L: 35/91 MS: 3 EraseBytes-ShuffleBytes-CrossOver-
00:07:20.605 [2024-12-02 15:23:06.421816] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:10272304541882814094 len:36495 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.605 [2024-12-02 15:23:06.421848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.864 #33 NEW cov: 11867 ft: 14288 corp: 14/477b lim: 105 exec/s: 0 rss: 68Mb L: 35/91 MS: 1 ChangeBinInt-
00:07:20.864 [2024-12-02 15:23:06.462036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2382364672 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.864 [2024-12-02 15:23:06.462067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.864 [2024-12-02 15:23:06.462203] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:10272304543006887566 len:36495 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.864 [2024-12-02 15:23:06.462226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:20.864 #39 NEW cov: 11867 ft: 14312 corp: 15/523b lim: 105 exec/s: 0 rss: 69Mb L: 46/91 MS: 1 InsertRepeatedBytes-
00:07:20.864 [2024-12-02 15:23:06.512173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:10272304541882814094 len:36495 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.864 [2024-12-02 15:23:06.512206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.864 [2024-12-02 15:23:06.512322] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:10272304543006887566 len:36541 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.864 [2024-12-02 15:23:06.512345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:20.864 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:07:20.864 #40 NEW cov: 11890 ft: 14352 corp: 16/569b lim: 105 exec/s: 0 rss: 69Mb L: 46/91 MS: 1 InsertRepeatedBytes-
00:07:20.864 [2024-12-02 15:23:06.562341] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:134217728 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.864 [2024-12-02 15:23:06.562373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.864 [2024-12-02 15:23:06.562473] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.864 [2024-12-02 15:23:06.562497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:20.864 [2024-12-02 15:23:06.562609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.864 [2024-12-02 15:23:06.562634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:07:20.864 [2024-12-02 15:23:06.562755] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.864 [2024-12-02 15:23:06.562779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:07:20.864 #41 NEW cov: 11890 ft: 14394 corp: 17/660b lim: 105 exec/s: 0 rss: 69Mb L: 91/91 MS: 1 ChangeBinInt-
00:07:20.864 [2024-12-02 15:23:06.622576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12153148316852988072 len:48154 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.864 [2024-12-02 15:23:06.622608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.864 [2024-12-02 15:23:06.622741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.865 [2024-12-02 15:23:06.622765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:20.865 #42 NEW cov: 11890 ft: 14418 corp: 18/709b lim: 105 exec/s: 42 rss: 69Mb L: 49/91 MS: 1 PersAutoDict- DE: "\001\226K\274\0312&\014"-
00:07:20.865 [2024-12-02 15:23:06.672785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12153149034112526504 len:41129 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.865 [2024-12-02 15:23:06.672817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:20.865 [2024-12-02 15:23:06.672918] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:20.865 [2024-12-02 15:23:06.672944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:21.124 #43 NEW cov: 11890 ft: 14500 corp: 19/758b lim: 105 exec/s: 43 rss: 69Mb L: 49/91 MS: 1 ChangeBit-
00:07:21.124 [2024-12-02 15:23:06.722719] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:114362111370461184 len:9741 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.124 [2024-12-02 15:23:06.722759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.124 #44 NEW cov: 11890 ft: 14511 corp: 20/784b lim: 105 exec/s: 44 rss: 69Mb L: 26/91 MS: 1 ChangeBinInt-
00:07:21.124 [2024-12-02 15:23:06.762352] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:10272304543006887566 len:36495 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.124 [2024-12-02 15:23:06.762380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.124 #45 NEW cov: 11890 ft: 14536 corp: 21/819b lim: 105 exec/s: 45 rss: 69Mb L: 35/91 MS: 1 ChangeBinInt-
00:07:21.124 [2024-12-02 15:23:06.802528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:114362111370461184 len:9741 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.124 [2024-12-02 15:23:06.802557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.124 #46 NEW cov: 11890 ft: 14551 corp: 22/845b lim: 105 exec/s: 46 rss: 69Mb L: 26/91 MS: 1 ChangeASCIIInt-
00:07:21.124 [2024-12-02 15:23:06.853356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12153169924833454248 len:43169 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.124 [2024-12-02 15:23:06.853389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.124 [2024-12-02 15:23:06.853479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.124 [2024-12-02 15:23:06.853501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:21.124 #47 NEW cov: 11890 ft: 14564 corp: 23/895b lim: 105 exec/s: 47 rss: 69Mb L: 50/91 MS: 1 InsertByte-
00:07:21.124 [2024-12-02 15:23:06.913608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12153149034112526504 len:41129 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.124 [2024-12-02 15:23:06.913639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.124 [2024-12-02 15:23:06.913761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.124 [2024-12-02 15:23:06.913785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:21.124 [2024-12-02 15:23:06.913916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:576460755133049000 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.124 [2024-12-02 15:23:06.913939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:07:21.124 #48 NEW cov: 11890 ft: 14894 corp: 24/962b lim: 105 exec/s: 48 rss: 69Mb L: 67/91 MS: 1 CrossOver-
00:07:21.383 [2024-12-02 15:23:06.952975] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:114362111370461184 len:9741 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.383 [2024-12-02 15:23:06.953001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.383 #49 NEW cov: 11890 ft: 14905 corp: 25/988b lim: 105 exec/s: 49 rss: 69Mb L: 26/91 MS: 1 CrossOver-
00:07:21.383 [2024-12-02 15:23:06.993504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:114362111370461184 len:9741 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.383 [2024-12-02 15:23:06.993529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.383 #50 NEW cov: 11890 ft: 14936 corp: 26/1014b lim: 105 exec/s: 50 rss: 69Mb L: 26/91 MS: 1 ChangeBinInt-
00:07:21.383 [2024-12-02 15:23:07.033706] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:114362111370461184 len:9741 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.383 [2024-12-02 15:23:07.033738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.383 [2024-12-02 15:23:07.033862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1103806922752 len:2636 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.383 [2024-12-02 15:23:07.033888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:21.383 #51 NEW cov: 11890 ft: 14963 corp: 27/1066b lim: 105 exec/s: 51 rss: 69Mb L: 52/91 MS: 1 CrossOver-
00:07:21.383 [2024-12-02 15:23:07.083672] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:86131987118555136 len:3261 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.383 [2024-12-02 15:23:07.083702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.383 #52 NEW cov: 11890 ft: 14976 corp: 28/1092b lim: 105 exec/s: 52 rss: 69Mb L: 26/91 MS: 1 ShuffleBytes-
00:07:21.383 [2024-12-02 15:23:07.123415] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:4265472441318375424 len:3261 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.383 [2024-12-02 15:23:07.123447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.383 #53 NEW cov: 11890 ft: 15002 corp: 29/1118b lim: 105 exec/s: 53 rss: 69Mb L: 26/91 MS: 1 ChangeByte-
00:07:21.383 [2024-12-02 15:23:07.164321] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12153149034112526504 len:41129 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.383 [2024-12-02 15:23:07.164352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.383 [2024-12-02 15:23:07.164455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.383 [2024-12-02 15:23:07.164477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:21.383 [2024-12-02 15:23:07.164598] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:576460755133049000 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.384 [2024-12-02 15:23:07.164620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:07:21.384 #54 NEW cov: 11890 ft: 15007 corp: 30/1187b lim: 105 exec/s: 54 rss: 70Mb L: 69/91 MS: 1 CopyPart-
00:07:21.643 [2024-12-02 15:23:07.224382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:114362111370461184 len:9741 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.643 [2024-12-02 15:23:07.224412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.643 [2024-12-02 15:23:07.224531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:1103806922752 len:2636 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.643 [2024-12-02 15:23:07.224552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:21.643 #55 NEW cov: 11890 ft: 15063 corp: 31/1239b lim: 105 exec/s: 55 rss: 70Mb L: 52/91 MS: 1 ChangeByte-
00:07:21.643 [2024-12-02 15:23:07.264189] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12153149034112526504 len:41129 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.643 [2024-12-02 15:23:07.264219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.643 [2024-12-02 15:23:07.264311] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.643 [2024-12-02 15:23:07.264333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:21.643 [2024-12-02 15:23:07.264457] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12153149157055965352 len:50373 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.643 [2024-12-02 15:23:07.264481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:07:21.643 #56 NEW cov: 11890 ft: 15076 corp: 32/1314b lim: 105 exec/s: 56 rss: 70Mb L: 75/91 MS: 1 InsertRepeatedBytes-
00:07:21.643 [2024-12-02 15:23:07.314987] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12153149034112526504 len:40969 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.643 [2024-12-02 15:23:07.315020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.643 [2024-12-02 15:23:07.315115] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12153140240703858856 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.643 [2024-12-02 15:23:07.315137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:21.643 [2024-12-02 15:23:07.315262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12153149036796881064 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.643 [2024-12-02 15:23:07.315282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:07:21.643 [2024-12-02 15:23:07.315403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:12153149036796881064 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.643 [2024-12-02 15:23:07.315425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:07:21.643 #57 NEW cov: 11890 ft: 15085 corp: 33/1409b lim: 105 exec/s: 57 rss: 70Mb L: 95/95 MS: 1 CrossOver-
00:07:21.643 [2024-12-02 15:23:07.364678] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:10272304541882814094 len:36495 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.643 [2024-12-02 15:23:07.364702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.643 #58 NEW cov: 11890 ft: 15180 corp: 34/1444b lim: 105 exec/s: 58 rss: 70Mb L: 35/95 MS: 1 ShuffleBytes-
00:07:21.643 [2024-12-02 15:23:07.415566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12153149034112526504 len:41129 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.643 [2024-12-02 15:23:07.415597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.643 [2024-12-02 15:23:07.415692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.643 [2024-12-02 15:23:07.415716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:21.644 [2024-12-02 15:23:07.415834] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:576460755133049000 len:80 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.644 [2024-12-02 15:23:07.415850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:07:21.644 [2024-12-02 15:23:07.415964] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:5714873654208057167 len:20304 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.644 [2024-12-02 15:23:07.415985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:07:21.644 [2024-12-02 15:23:07.416120] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:5714873314905640783 len:43009 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.644 [2024-12-02 15:23:07.416142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1
00:07:21.644 #59 NEW cov: 11890 ft: 15237 corp: 35/1549b lim: 105 exec/s: 59 rss: 70Mb L: 105/105 MS: 1 InsertRepeatedBytes-
00:07:21.644 [2024-12-02 15:23:07.454632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12153148316852988072 len:48154 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.644 [2024-12-02 15:23:07.454661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.644 [2024-12-02 15:23:07.454786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.644 [2024-12-02 15:23:07.454809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:21.903 #60 NEW cov: 11890 ft: 15307 corp: 36/1598b lim: 105 exec/s: 60 rss: 70Mb L: 49/105 MS: 1 ChangeBit-
00:07:21.903 [2024-12-02 15:23:07.515247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12153148316852988072 len:48154 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.903 [2024-12-02 15:23:07.515282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.903 [2024-12-02 15:23:07.515395] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12153149036796881064 len:43177 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.903 [2024-12-02 15:23:07.515426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:21.903 #61 NEW cov: 11890 ft: 15316 corp: 37/1655b lim: 105 exec/s: 61 rss: 70Mb L: 57/105 MS: 1 PersAutoDict- DE: "\001\226K\274\0312&\014"-
00:07:21.903 [2024-12-02 15:23:07.554820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12153149034112526504 len:41129 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.903 [2024-12-02 15:23:07.554845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.903 #62 NEW cov: 11890 ft: 15325 corp: 38/1690b lim: 105 exec/s: 62 rss: 70Mb L: 35/105 MS: 1 EraseBytes-
00:07:21.903 [2024-12-02 15:23:07.594860] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:5457264551005257728 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:21.903 [2024-12-02 15:23:07.594893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:21.903 #63 NEW cov: 11890 ft: 15334 corp: 39/1716b lim: 105 exec/s: 31 rss: 70Mb L: 26/105 MS: 1 CopyPart-
00:07:21.903 #63 DONE cov: 11890 ft: 15334 corp: 39/1716b lim: 105 exec/s: 31 rss: 70Mb
00:07:21.903 ###### Recommended dictionary. ######
00:07:21.903 "\001\226K\274\0312&\014" # Uses: 3
00:07:21.903 ###### End of recommended dictionary. ######
00:07:21.903 Done 63 runs in 2 second(s)
00:07:22.163 15:23:07 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf
00:07:22.163 15:23:07 -- ../common.sh@72 -- # (( i++ ))
00:07:22.163 15:23:07 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:22.163 15:23:07 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1
00:07:22.163 15:23:07 -- nvmf/run.sh@23 -- # local fuzzer_type=17
00:07:22.163 15:23:07 -- nvmf/run.sh@24 -- # local timen=1
00:07:22.163 15:23:07 -- nvmf/run.sh@25 -- # local core=0x1
00:07:22.163 15:23:07 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:07:22.163 15:23:07 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf
00:07:22.163 15:23:07 -- nvmf/run.sh@29 -- # printf %02d 17
00:07:22.163 15:23:07 -- nvmf/run.sh@29 -- # port=4417
00:07:22.163 15:23:07 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17
00:07:22.163 15:23:07 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417'
00:07:22.163 15:23:07 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:22.163 15:23:07 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock
00:07:22.422 [2024-12-02 15:23:07.780744] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:22.422 [2024-12-02 15:23:07.780809] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1603077 ]
00:07:22.422 EAL: No free 2048 kB hugepages reported on node 1
00:07:22.422 [2024-12-02 15:23:07.956575] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:22.422 [2024-12-02 15:23:08.018673] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:22.422 [2024-12-02 15:23:08.018799] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:22.422 [2024-12-02 15:23:08.076552] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:22.422 [2024-12-02 15:23:08.092850] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 ***
00:07:22.422 INFO: Running with entropic power schedule (0xFF, 100).
00:07:22.422 INFO: Seed: 2499076096 00:07:22.422 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:22.422 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:22.422 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:22.422 INFO: A corpus is not provided, starting from an empty corpus 00:07:22.422 #2 INITED exec/s: 0 rss: 61Mb 00:07:22.422 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:22.422 This may also happen if the target rejected all inputs we tried so far 00:07:22.422 [2024-12-02 15:23:08.158761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672232970 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.422 [2024-12-02 15:23:08.158800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.422 [2024-12-02 15:23:08.158922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.422 [2024-12-02 15:23:08.158942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.682 NEW_FUNC[1/672]: 0x4545f8 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:07:22.682 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:22.682 #7 NEW cov: 11684 ft: 11683 corp: 2/50b lim: 120 exec/s: 0 rss: 68Mb L: 49/49 MS: 5 CopyPart-InsertByte-InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:07:22.682 [2024-12-02 15:23:08.489723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672232970 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.682 [2024-12-02 15:23:08.489792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.942 #8 NEW cov: 11797 ft: 13200 corp: 3/75b lim: 120 exec/s: 0 rss: 68Mb L: 25/49 MS: 1 EraseBytes- 00:07:22.942 [2024-12-02 15:23:08.539655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672236800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.942 [2024-12-02 15:23:08.539691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.942 #9 NEW cov: 11803 ft: 13358 corp: 4/100b lim: 120 exec/s: 0 rss: 68Mb L: 25/49 MS: 1 ChangeBinInt- 00:07:22.942 [2024-12-02 15:23:08.589968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672236800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.942 [2024-12-02 15:23:08.590002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.942 [2024-12-02 15:23:08.590132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.942 [2024-12-02 15:23:08.590152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.942 [2024-12-02 15:23:08.590272] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.942 [2024-12-02 15:23:08.590295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.942 #10 NEW cov: 11888 ft: 13975 corp: 5/178b lim: 120 exec/s: 0 rss: 69Mb L: 78/78 MS: 1 InsertRepeatedBytes- 00:07:22.942 [2024-12-02 15:23:08.640486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672236800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.942 [2024-12-02 15:23:08.640517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.942 [2024-12-02 15:23:08.640633] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.942 [2024-12-02 15:23:08.640656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.942 [2024-12-02 15:23:08.640777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.942 [2024-12-02 15:23:08.640798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.942 #11 NEW cov: 11888 ft: 14113 corp: 6/257b lim: 120 exec/s: 0 rss: 69Mb L: 79/79 MS: 1 InsertByte- 00:07:22.942 [2024-12-02 15:23:08.690702] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.942 [2024-12-02 15:23:08.690739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.942 [2024-12-02 15:23:08.690842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.942 [2024-12-02 15:23:08.690866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.942 [2024-12-02 15:23:08.690977] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.942 [2024-12-02 15:23:08.690999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.942 #13 NEW cov: 11888 ft: 14204 corp: 7/332b lim: 120 exec/s: 0 rss: 69Mb L: 75/79 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:22.942 [2024-12-02 15:23:08.730793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.942 [2024-12-02 15:23:08.730826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.942 [2024-12-02 15:23:08.730929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.942 [2024-12-02 15:23:08.730947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:07:22.942 [2024-12-02 15:23:08.731065] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:22.942 [2024-12-02 15:23:08.731091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.942 #14 NEW cov: 11888 ft: 14268 corp: 8/407b lim: 120 exec/s: 0 rss: 69Mb L: 75/79 MS: 1 ShuffleBytes- 00:07:23.202 [2024-12-02 15:23:08.780758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672236800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.202 [2024-12-02 15:23:08.780791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.202 [2024-12-02 15:23:08.780910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.202 [2024-12-02 15:23:08.780935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.202 #15 NEW cov: 11888 ft: 14331 corp: 9/478b lim: 120 exec/s: 0 rss: 69Mb L: 71/79 MS: 1 EraseBytes- 00:07:23.202 [2024-12-02 15:23:08.821052] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.202 [2024-12-02 15:23:08.821082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.202 [2024-12-02 15:23:08.821198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.202 [2024-12-02 15:23:08.821218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.202 [2024-12-02 15:23:08.821343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.202 [2024-12-02 15:23:08.821367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.202 #16 NEW cov: 11888 ft: 14370 corp: 10/553b lim: 120 exec/s: 0 rss: 69Mb L: 75/79 MS: 1 ShuffleBytes- 00:07:23.202 [2024-12-02 15:23:08.861501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.202 [2024-12-02 15:23:08.861531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.202 [2024-12-02 15:23:08.861630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.202 [2024-12-02 15:23:08.861652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.202 [2024-12-02 15:23:08.861767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:889058230272 len:53200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.202 [2024-12-02 15:23:08.861784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:07:23.202 [2024-12-02 15:23:08.861898] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:14974415777481871311 len:53200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.202 [2024-12-02 15:23:08.861918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.202 #17 NEW cov: 11888 ft: 14741 corp: 11/664b lim: 120 exec/s: 0 rss: 69Mb L: 111/111 MS: 1 InsertRepeatedBytes- 00:07:23.202 [2024-12-02 15:23:08.911459] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.202 [2024-12-02 15:23:08.911491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.202 [2024-12-02 15:23:08.911627] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:3617008641903833650 len:12851 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.202 [2024-12-02 15:23:08.911652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.202 [2024-12-02 15:23:08.911771] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.203 [2024-12-02 15:23:08.911798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.203 #18 NEW cov: 11888 ft: 14779 corp: 12/755b lim: 120 exec/s: 0 rss: 69Mb L: 91/111 MS: 1 InsertRepeatedBytes- 00:07:23.203 [2024-12-02 15:23:08.951160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672236800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.203 [2024-12-02 15:23:08.951185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.203 [2024-12-02 15:23:08.951326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.203 [2024-12-02 15:23:08.951346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.203 #19 NEW cov: 11888 ft: 14826 corp: 13/826b lim: 120 exec/s: 0 rss: 69Mb L: 71/111 MS: 1 ChangeBinInt- 00:07:23.203 [2024-12-02 15:23:08.991589] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.203 [2024-12-02 15:23:08.991616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.203 [2024-12-02 15:23:08.991739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.203 [2024-12-02 15:23:08.991764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.203 [2024-12-02 15:23:08.991902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.203 [2024-12-02 15:23:08.991923] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.203 #20 NEW cov: 11888 ft: 14863 corp: 14/901b lim: 120 exec/s: 0 rss: 69Mb L: 75/111 MS: 1 CopyPart- 00:07:23.462 [2024-12-02 15:23:09.031245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672236800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.462 [2024-12-02 15:23:09.031275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.462 [2024-12-02 15:23:09.031384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1834218991765451892 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.462 [2024-12-02 15:23:09.031417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.462 [2024-12-02 15:23:09.031541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.462 [2024-12-02 15:23:09.031564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.462 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:23.462 #21 NEW cov: 11911 ft: 14900 corp: 15/980b lim: 120 exec/s: 0 rss: 69Mb L: 79/111 MS: 1 ChangeByte- 00:07:23.462 [2024-12-02 15:23:09.081287] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1801439853615775744 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.462 [2024-12-02 15:23:09.081312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.462 #22 NEW cov: 11911 ft: 14925 corp: 16/1005b lim: 120 exec/s: 0 rss: 69Mb L: 25/111 MS: 1 ChangeBinInt- 00:07:23.462 [2024-12-02 15:23:09.121933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672236800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.462 [2024-12-02 15:23:09.121965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.462 [2024-12-02 15:23:09.122100] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1834218991765451892 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.462 [2024-12-02 15:23:09.122121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.462 [2024-12-02 15:23:09.122244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.462 [2024-12-02 15:23:09.122273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.462 #23 NEW cov: 11911 ft: 14949 corp: 17/1084b lim: 120 exec/s: 23 rss: 69Mb L: 79/111 MS: 1 ShuffleBytes- 00:07:23.462 [2024-12-02 15:23:09.162185] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.462 [2024-12-02 15:23:09.162213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.463 [2024-12-02 15:23:09.162323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.463 [2024-12-02 15:23:09.162346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.463 [2024-12-02 15:23:09.162466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.463 [2024-12-02 15:23:09.162485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.463 [2024-12-02 15:23:09.162599] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.463 [2024-12-02 15:23:09.162619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.463 #24 NEW cov: 11911 ft: 14957 corp: 18/1196b lim: 120 exec/s: 24 rss: 69Mb L: 112/112 MS: 1 CopyPart- 00:07:23.463 [2024-12-02 15:23:09.201990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672232970 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.463 [2024-12-02 15:23:09.202022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.463 [2024-12-02 15:23:09.202169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:144115188075855872 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.463 [2024-12-02 15:23:09.202194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.463 #25 NEW cov: 11911 ft: 14971 corp: 19/1245b lim: 120 exec/s: 25 rss: 69Mb L: 49/112 MS: 1 ChangeBit- 00:07:23.463 [2024-12-02 15:23:09.241777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.463 [2024-12-02 15:23:09.241803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.463 #26 NEW cov: 11911 ft: 14987 corp: 20/1290b lim: 120 exec/s: 26 rss: 69Mb L: 45/112 MS: 1 EraseBytes- 00:07:23.722 [2024-12-02 15:23:09.281978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1801439853615775744 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.722 [2024-12-02 15:23:09.282004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.722 #27 NEW cov: 11911 ft: 15035 corp: 21/1315b lim: 120 exec/s: 27 rss: 69Mb L: 25/112 MS: 1 ChangeBit- 00:07:23.722 [2024-12-02 15:23:09.322006] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1801440394781655040 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.722 [2024-12-02 15:23:09.322039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.722 #28 NEW cov: 11911 ft: 15058 corp: 22/1340b lim: 120 exec/s: 28 rss: 69Mb L: 25/112 MS: 1 ChangeByte- 00:07:23.722 [2024-12-02 15:23:09.362468] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672236800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.722 [2024-12-02 15:23:09.362498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.722 [2024-12-02 15:23:09.362616] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.722 [2024-12-02 15:23:09.362637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.722 #29 NEW cov: 11911 ft: 15096 corp: 23/1405b lim: 120 exec/s: 29 rss: 70Mb L: 65/112 MS: 1 CrossOver- 00:07:23.722 [2024-12-02 15:23:09.402807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672236800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.722 [2024-12-02 15:23:09.402838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.722 [2024-12-02 15:23:09.402939] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460051558072319 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.722 [2024-12-02 15:23:09.402961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.722 [2024-12-02 15:23:09.403079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8358759473895208052 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.722 [2024-12-02 15:23:09.403100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.722 #30 NEW cov: 11911 ft: 15104 corp: 24/1483b lim: 120 exec/s: 30 rss: 70Mb L: 78/112 MS: 1 InsertRepeatedBytes- 00:07:23.722 [2024-12-02 15:23:09.443059] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:277550143744 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.722 [2024-12-02 15:23:09.443089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.722 [2024-12-02 15:23:09.443199] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1834218991765451892 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.722 [2024-12-02 15:23:09.443220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.722 [2024-12-02 15:23:09.443340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.722 [2024-12-02 15:23:09.443362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.722 #31 NEW cov: 11911 ft: 15107 corp: 25/1562b lim: 120 exec/s: 31 rss: 70Mb L: 79/112 MS: 1 ChangeBit- 00:07:23.722 [2024-12-02 15:23:09.482846] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672232970 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.722 [2024-12-02 15:23:09.482876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.723 [2024-12-02 15:23:09.482998] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:144115188075855872 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.723 [2024-12-02 15:23:09.483020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.723 #32 NEW cov: 11911 ft: 15124 corp: 26/1611b lim: 120 exec/s: 32 rss: 70Mb L: 49/112 MS: 1 CrossOver- 00:07:23.723 [2024-12-02 15:23:09.522660] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672236800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.723 [2024-12-02 15:23:09.522684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.982 #33 NEW cov: 11911 ft: 15132 corp: 27/1636b lim: 120 exec/s: 33 rss: 70Mb L: 25/112 MS: 1 CrossOver- 00:07:23.982 [2024-12-02 15:23:09.563528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.982 [2024-12-02 15:23:09.563563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.982 [2024-12-02 15:23:09.563651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.982 [2024-12-02 15:23:09.563670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.982 [2024-12-02 15:23:09.563799] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.982 [2024-12-02 15:23:09.563821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.982 [2024-12-02 15:23:09.563940] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.982 [2024-12-02 15:23:09.563962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.982 #34 NEW cov: 11911 ft: 15145 corp: 28/1748b lim: 120 exec/s: 34 rss: 70Mb L: 112/112 MS: 1 ChangeBit- 00:07:23.982 [2024-12-02 15:23:09.613476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672236800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.982 [2024-12-02 15:23:09.613508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.982 [2024-12-02 15:23:09.613625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8391460049216893977 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.982 [2024-12-02 15:23:09.613646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.982 [2024-12-02 15:23:09.613761] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8391460049216894068 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.982 [2024-12-02 15:23:09.613783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.982 #35 NEW cov: 11911 ft: 15171 corp: 29/1826b lim: 120 exec/s: 35 rss: 70Mb L: 78/112 MS: 1 EraseBytes- 00:07:23.982 [2024-12-02 15:23:09.653559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672232970 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.982 [2024-12-02 15:23:09.653594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.982 [2024-12-02 15:23:09.653707] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:11863788345444574372 len:42149 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.982 [2024-12-02 15:23:09.653727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.982 [2024-12-02 15:23:09.653859] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:11863788345444574372 len:42149 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.982 [2024-12-02 15:23:09.653881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.982 [2024-12-02 15:23:09.703542] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672232970 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.982 [2024-12-02 15:23:09.703575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.982 [2024-12-02 15:23:09.703688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:11863788345444574372 len:42149 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.982 [2024-12-02 15:23:09.703709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.982 [2024-12-02 15:23:09.703830] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:11863788345444574372 len:42149 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.982 [2024-12-02 15:23:09.703852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.982 #37 NEW cov: 11911 ft: 15186 corp: 30/1915b lim: 120 exec/s: 37 rss: 70Mb L: 89/112 MS: 2 InsertRepeatedBytes-ChangeBinInt- 00:07:23.982 [2024-12-02 15:23:09.742885] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672236800 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.982 [2024-12-02 15:23:09.742917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.982 #38 NEW cov: 11911 ft: 15207 corp: 31/1940b lim: 120 exec/s: 38 rss: 70Mb L: 25/112 MS: 1 CopyPart- 00:07:23.982 [2024-12-02 15:23:09.793750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2672232970 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:23.982 [2024-12-02 15:23:09.793788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.982 [2024-12-02 15:23:09.793905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
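(Annotation, for readers scanning these runs: the "#N NEW cov: ..." lines interleaved above are standard libFuzzer status output. cov: is the number of instrumented coverage points hit so far, ft: the number of features, corp: the corpus size in units/bytes, lim: the current input-length cap, exec/s: the execution rate, rss: resident memory, L: a/b the new unit's length over the largest unit seen, and MS: the mutation sequence that produced the unit. A minimal sketch for pulling the coverage progression out of a saved copy of this console output; "build.log" is a placeholder filename, and grep -o works whether or not the original line breaks survived:

  grep -Eo '#[0-9]+ (NEW|DONE) cov: [0-9]+ ft: [0-9]+ corp: [0-9]+/[0-9]+b' build.log \
    | awk '{print $1, "cov=" $4, "ft=" $6, "corp=" $8}'
)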
00:07:23.982 [2024-12-02 15:23:09.793928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.242 #39 NEW cov: 11911 ft: 15255 corp: 32/1989b lim: 120 exec/s: 39 rss: 70Mb L: 49/112 MS: 1 CrossOver- 00:07:24.242 [2024-12-02 15:23:09.843821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.242 [2024-12-02 15:23:09.843851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.242 [2024-12-02 15:23:09.843942] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.242 [2024-12-02 15:23:09.843967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.242 #43 NEW cov: 11911 ft: 15294 corp: 33/2054b lim: 120 exec/s: 43 rss: 70Mb L: 65/112 MS: 4 EraseBytes-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:07:24.242 [2024-12-02 15:23:09.884247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1801439853615775744 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.242 [2024-12-02 15:23:09.884281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.242 [2024-12-02 15:23:09.884401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.242 [2024-12-02 15:23:09.884424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.242 [2024-12-02 15:23:09.884558] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.242 [2024-12-02 15:23:09.884583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.242 #44 NEW cov: 11911 ft: 15379 corp: 34/2138b lim: 120 exec/s: 44 rss: 70Mb L: 84/112 MS: 1 InsertRepeatedBytes- 00:07:24.242 [2024-12-02 15:23:09.934612] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.242 [2024-12-02 15:23:09.934646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.242 [2024-12-02 15:23:09.934747] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:4278190080 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.242 [2024-12-02 15:23:09.934771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.242 [2024-12-02 15:23:09.934886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.242 [2024-12-02 15:23:09.934905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.242 [2024-12-02 15:23:09.935017] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.242 [2024-12-02 15:23:09.935040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.242 #45 NEW cov: 11911 ft: 15451 corp: 35/2250b lim: 120 exec/s: 45 rss: 70Mb L: 112/112 MS: 1 ChangeBinInt- 00:07:24.242 [2024-12-02 15:23:09.983617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:32779141535701248 len:29813 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.242 [2024-12-02 15:23:09.983643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.242 #46 NEW cov: 11911 ft: 15462 corp: 36/2294b lim: 120 exec/s: 46 rss: 70Mb L: 44/112 MS: 1 EraseBytes- 00:07:24.242 [2024-12-02 15:23:10.045105] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.242 [2024-12-02 15:23:10.045136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.242 [2024-12-02 15:23:10.045224] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.242 [2024-12-02 15:23:10.045244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.242 [2024-12-02 15:23:10.045358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:889058230272 len:53200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.242 [2024-12-02 15:23:10.045378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.242 [2024-12-02 15:23:10.045499] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:14974415777481871311 len:53200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.242 [2024-12-02 15:23:10.045522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.502 #47 NEW cov: 11911 ft: 15581 corp: 37/2405b lim: 120 exec/s: 47 rss: 70Mb L: 111/112 MS: 1 ChangeBinInt- 00:07:24.502 [2024-12-02 15:23:10.105003] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.502 [2024-12-02 15:23:10.105040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.502 [2024-12-02 15:23:10.105156] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:3614475367113437746 len:12851 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.502 [2024-12-02 15:23:10.105176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.502 [2024-12-02 15:23:10.105291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:24.502 [2024-12-02 15:23:10.105316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.502 #48 NEW cov: 11911 ft: 
15601 corp: 38/2497b lim: 120 exec/s: 48 rss: 70Mb L: 92/112 MS: 1 InsertByte-
00:07:24.502 [2024-12-02 15:23:10.154699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:15770157678700714714 len:56027 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:24.502 [2024-12-02 15:23:10.154731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:24.502 [2024-12-02 15:23:10.154862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:15770157678700714714 len:56027 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:24.502 [2024-12-02 15:23:10.154887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:24.502 [2024-12-02 15:23:10.155009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:15770157678700714714 len:56027 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:24.502 [2024-12-02 15:23:10.155031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:07:24.502 #49 NEW cov: 11911 ft: 15610 corp: 39/2571b lim: 120 exec/s: 24 rss: 70Mb L: 74/112 MS: 1 InsertRepeatedBytes-
00:07:24.502 #49 DONE cov: 11911 ft: 15610 corp: 39/2571b lim: 120 exec/s: 24 rss: 70Mb
00:07:24.502 Done 49 runs in 2 second(s)
00:07:24.502 15:23:10 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf
15:23:10 -- ../common.sh@72 -- # (( i++ ))
15:23:10 -- ../common.sh@72 -- # (( i < fuzz_num ))
15:23:10 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1
15:23:10 -- nvmf/run.sh@23 -- # local fuzzer_type=18
15:23:10 -- nvmf/run.sh@24 -- # local timen=1
15:23:10 -- nvmf/run.sh@25 -- # local core=0x1
15:23:10 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
15:23:10 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf
15:23:10 -- nvmf/run.sh@29 -- # printf %02d 18
15:23:10 -- nvmf/run.sh@29 -- # port=4418
15:23:10 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
15:23:10 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418'
15:23:10 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
15:23:10 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock
00:07:24.762 [2024-12-02 15:23:10.331600] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
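(Annotation: the xtrace lines above carry everything needed to reproduce this fuzzer instance outside Jenkins. A minimal sketch under two assumptions: run.sh redirects the sed output into /tmp/fuzz_json_18.conf, which xtrace does not print, and the workspace root is abbreviated to $SPDK; all flags are copied verbatim from the trace, with -Z 18 coming from fuzzer_type=18 and -t 1 from timen=1:

  # re-run nvmf fuzzer 18 by hand with the flags run.sh passes above
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  mkdir -p "$SPDK/../corpus/llvm_nvmf_18"
  # retarget the stock config from trsvcid 4420 to this run's port 4418
  sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' \
      "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > /tmp/fuzz_json_18.conf
  "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
      -P "$SPDK/../output/llvm/" \
      -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' \
      -c /tmp/fuzz_json_18.conf -t 1 \
      -D "$SPDK/../corpus/llvm_nvmf_18" -Z 18 -r /var/tmp/spdk18.sock
)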
00:07:24.762 [2024-12-02 15:23:10.331668] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1603616 ]
00:07:24.762 EAL: No free 2048 kB hugepages reported on node 1
00:07:24.762 [2024-12-02 15:23:10.507986] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:24.762 [2024-12-02 15:23:10.569277] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:24.762 [2024-12-02 15:23:10.569399] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:25.022 [2024-12-02 15:23:10.627259] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:25.022 [2024-12-02 15:23:10.643552] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 ***
00:07:25.022 INFO: Running with entropic power schedule (0xFF, 100).
00:07:25.022 INFO: Seed: 754102353
00:07:25.022 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:25.022 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:25.022 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18
00:07:25.022 INFO: A corpus is not provided, starting from an empty corpus
00:07:25.022 #2 INITED exec/s: 0 rss: 60Mb
00:07:25.022 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:25.022 This may also happen if the target rejected all inputs we tried so far
00:07:25.022 [2024-12-02 15:23:10.688971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:07:25.022 [2024-12-02 15:23:10.689000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:25.022 [2024-12-02 15:23:10.689036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0
00:07:25.022 [2024-12-02 15:23:10.689050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:25.022 [2024-12-02 15:23:10.689102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0
00:07:25.022 [2024-12-02 15:23:10.689117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:07:25.022 [2024-12-02 15:23:10.689169] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0
00:07:25.022 [2024-12-02 15:23:10.689183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:07:25.281 NEW_FUNC[1/670]: 0x457e58 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562
00:07:25.281 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:25.281 #3 NEW cov: 11628 ft: 11629 corp: 2/86b lim: 100 exec/s: 0 rss: 68Mb L: 85/85 MS: 1 InsertRepeatedBytes-
00:07:25.281 [2024-12-02 15:23:10.999650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0
00:07:25.281 [2024-12-02 15:23:10.999682] nvme_qpair.c:
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.281 [2024-12-02 15:23:10.999721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.281 [2024-12-02 15:23:10.999738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.281 [2024-12-02 15:23:10.999789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.281 [2024-12-02 15:23:10.999802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.281 [2024-12-02 15:23:10.999854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.281 [2024-12-02 15:23:10.999867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.281 #9 NEW cov: 11741 ft: 11949 corp: 3/179b lim: 100 exec/s: 0 rss: 68Mb L: 93/93 MS: 1 CMP- DE: "\273\374\255:\277K\226\000"- 00:07:25.281 [2024-12-02 15:23:11.039675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.281 [2024-12-02 15:23:11.039702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.281 [2024-12-02 15:23:11.039751] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.281 [2024-12-02 15:23:11.039765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.281 [2024-12-02 15:23:11.039818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.281 [2024-12-02 15:23:11.039833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.281 [2024-12-02 15:23:11.039883] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.281 [2024-12-02 15:23:11.039895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.281 #10 NEW cov: 11747 ft: 12208 corp: 4/265b lim: 100 exec/s: 0 rss: 68Mb L: 86/93 MS: 1 CrossOver- 00:07:25.281 [2024-12-02 15:23:11.079798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.281 [2024-12-02 15:23:11.079826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.281 [2024-12-02 15:23:11.079863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.281 [2024-12-02 15:23:11.079876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.281 [2024-12-02 15:23:11.079927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.281 [2024-12-02 15:23:11.079941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.281 [2024-12-02 15:23:11.079994] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.281 [2024-12-02 15:23:11.080007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.541 #11 NEW cov: 11832 ft: 12445 corp: 5/363b lim: 100 exec/s: 0 rss: 68Mb L: 98/98 MS: 1 InsertRepeatedBytes- 00:07:25.541 [2024-12-02 15:23:11.119921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.541 [2024-12-02 15:23:11.119948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.541 [2024-12-02 15:23:11.119989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.541 [2024-12-02 15:23:11.120003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.541 [2024-12-02 15:23:11.120054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.541 [2024-12-02 15:23:11.120067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.541 [2024-12-02 15:23:11.120119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.541 [2024-12-02 15:23:11.120133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.541 #12 NEW cov: 11832 ft: 12506 corp: 6/456b lim: 100 exec/s: 0 rss: 68Mb L: 93/98 MS: 1 ChangeBit- 00:07:25.541 [2024-12-02 15:23:11.160050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.541 [2024-12-02 15:23:11.160076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.541 [2024-12-02 15:23:11.160119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.541 [2024-12-02 15:23:11.160133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.541 [2024-12-02 15:23:11.160183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.541 [2024-12-02 15:23:11.160198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.541 [2024-12-02 15:23:11.160249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.541 [2024-12-02 15:23:11.160263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.541 #13 NEW cov: 11832 ft: 12718 corp: 7/542b lim: 100 exec/s: 0 rss: 68Mb L: 86/98 MS: 1 ShuffleBytes- 00:07:25.541 [2024-12-02 15:23:11.200133] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.541 [2024-12-02 15:23:11.200162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.541 [2024-12-02 15:23:11.200196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES 
(08) sqid:1 cid:1 nsid:0 00:07:25.541 [2024-12-02 15:23:11.200208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.541 [2024-12-02 15:23:11.200258] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.541 [2024-12-02 15:23:11.200273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.541 [2024-12-02 15:23:11.200324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.541 [2024-12-02 15:23:11.200337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.541 #14 NEW cov: 11832 ft: 12799 corp: 8/641b lim: 100 exec/s: 0 rss: 68Mb L: 99/99 MS: 1 InsertByte- 00:07:25.541 [2024-12-02 15:23:11.240262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.541 [2024-12-02 15:23:11.240288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.541 [2024-12-02 15:23:11.240324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.541 [2024-12-02 15:23:11.240338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.542 [2024-12-02 15:23:11.240384] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.542 [2024-12-02 15:23:11.240398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.542 [2024-12-02 15:23:11.240446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.542 [2024-12-02 15:23:11.240460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.542 #15 NEW cov: 11832 ft: 12883 corp: 9/734b lim: 100 exec/s: 0 rss: 68Mb L: 93/99 MS: 1 ChangeByte- 00:07:25.542 [2024-12-02 15:23:11.280351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.542 [2024-12-02 15:23:11.280378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.542 [2024-12-02 15:23:11.280419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.542 [2024-12-02 15:23:11.280434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.542 [2024-12-02 15:23:11.280480] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.542 [2024-12-02 15:23:11.280494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.542 [2024-12-02 15:23:11.280543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.542 [2024-12-02 15:23:11.280558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 
m:0 dnr:1 00:07:25.542 #16 NEW cov: 11832 ft: 12900 corp: 10/821b lim: 100 exec/s: 0 rss: 68Mb L: 87/99 MS: 1 InsertByte- 00:07:25.542 [2024-12-02 15:23:11.310450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.542 [2024-12-02 15:23:11.310477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.542 [2024-12-02 15:23:11.310512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.542 [2024-12-02 15:23:11.310526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.542 [2024-12-02 15:23:11.310578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.542 [2024-12-02 15:23:11.310593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.542 [2024-12-02 15:23:11.310644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.542 [2024-12-02 15:23:11.310658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.542 #17 NEW cov: 11832 ft: 12929 corp: 11/919b lim: 100 exec/s: 0 rss: 68Mb L: 98/99 MS: 1 CopyPart- 00:07:25.542 [2024-12-02 15:23:11.350545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.542 [2024-12-02 15:23:11.350571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.542 [2024-12-02 15:23:11.350617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.542 [2024-12-02 15:23:11.350631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.542 [2024-12-02 15:23:11.350682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.542 [2024-12-02 15:23:11.350697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.542 [2024-12-02 15:23:11.350751] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.542 [2024-12-02 15:23:11.350766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.802 #18 NEW cov: 11832 ft: 12945 corp: 12/1016b lim: 100 exec/s: 0 rss: 68Mb L: 97/99 MS: 1 InsertRepeatedBytes- 00:07:25.802 [2024-12-02 15:23:11.390685] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.802 [2024-12-02 15:23:11.390712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.390756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.802 [2024-12-02 15:23:11.390770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.390821] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.802 [2024-12-02 15:23:11.390835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.390886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.802 [2024-12-02 15:23:11.390899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.802 #19 NEW cov: 11832 ft: 13047 corp: 13/1114b lim: 100 exec/s: 0 rss: 69Mb L: 98/99 MS: 1 PersAutoDict- DE: "\273\374\255:\277K\226\000"- 00:07:25.802 [2024-12-02 15:23:11.430784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.802 [2024-12-02 15:23:11.430811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.430853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.802 [2024-12-02 15:23:11.430867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.430919] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.802 [2024-12-02 15:23:11.430933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.430984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.802 [2024-12-02 15:23:11.430998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.802 #20 NEW cov: 11832 ft: 13059 corp: 14/1199b lim: 100 exec/s: 0 rss: 69Mb L: 85/99 MS: 1 ShuffleBytes- 00:07:25.802 [2024-12-02 15:23:11.470900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.802 [2024-12-02 15:23:11.470926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.470972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.802 [2024-12-02 15:23:11.470985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.471035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.802 [2024-12-02 15:23:11.471049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.471100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.802 [2024-12-02 15:23:11.471115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.802 #21 NEW cov: 11832 ft: 13120 corp: 15/1292b lim: 100 exec/s: 0 rss: 69Mb L: 93/99 MS: 1 ChangeBit- 00:07:25.802 [2024-12-02 15:23:11.511063] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.802 [2024-12-02 15:23:11.511089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.511123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.802 [2024-12-02 15:23:11.511137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.511186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.802 [2024-12-02 15:23:11.511201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.511252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.802 [2024-12-02 15:23:11.511265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.802 #22 NEW cov: 11832 ft: 13155 corp: 16/1375b lim: 100 exec/s: 0 rss: 70Mb L: 83/99 MS: 1 EraseBytes- 00:07:25.802 [2024-12-02 15:23:11.541120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.802 [2024-12-02 15:23:11.541145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.541190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.802 [2024-12-02 15:23:11.541204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.541255] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.802 [2024-12-02 15:23:11.541269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.541319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.802 [2024-12-02 15:23:11.541332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.802 #23 NEW cov: 11832 ft: 13221 corp: 17/1462b lim: 100 exec/s: 0 rss: 70Mb L: 87/99 MS: 1 ChangeBit- 00:07:25.802 [2024-12-02 15:23:11.581283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:25.802 [2024-12-02 15:23:11.581309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.581352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:25.802 [2024-12-02 15:23:11.581367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.581416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:25.802 [2024-12-02 15:23:11.581430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.802 [2024-12-02 15:23:11.581480] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:25.802 [2024-12-02 15:23:11.581494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.802 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:25.802 #24 NEW cov: 11855 ft: 13261 corp: 18/1556b lim: 100 exec/s: 0 rss: 70Mb L: 94/99 MS: 1 InsertByte- 00:07:26.062 [2024-12-02 15:23:11.621376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.062 [2024-12-02 15:23:11.621404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.621439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.062 [2024-12-02 15:23:11.621453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.621503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.062 [2024-12-02 15:23:11.621517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.621568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.062 [2024-12-02 15:23:11.621582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.062 #25 NEW cov: 11855 ft: 13291 corp: 19/1642b lim: 100 exec/s: 0 rss: 70Mb L: 86/99 MS: 1 ChangeBinInt- 00:07:26.062 [2024-12-02 15:23:11.661510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.062 [2024-12-02 15:23:11.661536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.661570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.062 [2024-12-02 15:23:11.661584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.661634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.062 [2024-12-02 15:23:11.661649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.661700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.062 [2024-12-02 15:23:11.661713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.062 #26 NEW cov: 11855 ft: 13307 corp: 20/1731b lim: 100 exec/s: 26 rss: 70Mb L: 89/99 MS: 1 InsertRepeatedBytes- 00:07:26.062 [2024-12-02 15:23:11.701615] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.062 [2024-12-02 15:23:11.701644] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.701680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.062 [2024-12-02 15:23:11.701693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.701746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.062 [2024-12-02 15:23:11.701762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.701813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.062 [2024-12-02 15:23:11.701827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.062 #27 NEW cov: 11855 ft: 13342 corp: 21/1817b lim: 100 exec/s: 27 rss: 70Mb L: 86/99 MS: 1 InsertByte- 00:07:26.062 [2024-12-02 15:23:11.741578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.062 [2024-12-02 15:23:11.741604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.741640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.062 [2024-12-02 15:23:11.741653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.741704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.062 [2024-12-02 15:23:11.741718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.062 #28 NEW cov: 11855 ft: 13664 corp: 22/1877b lim: 100 exec/s: 28 rss: 70Mb L: 60/99 MS: 1 EraseBytes- 00:07:26.062 [2024-12-02 15:23:11.781825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.062 [2024-12-02 15:23:11.781851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.781893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.062 [2024-12-02 15:23:11.781907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.781957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.062 [2024-12-02 15:23:11.781970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.782020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.062 [2024-12-02 15:23:11.782033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.062 #29 NEW cov: 11855 ft: 13672 corp: 23/1967b lim: 100 
exec/s: 29 rss: 70Mb L: 90/99 MS: 1 CopyPart- 00:07:26.062 [2024-12-02 15:23:11.821738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.062 [2024-12-02 15:23:11.821764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.821804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.062 [2024-12-02 15:23:11.821818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.062 #30 NEW cov: 11855 ft: 14026 corp: 24/2007b lim: 100 exec/s: 30 rss: 70Mb L: 40/99 MS: 1 CrossOver- 00:07:26.062 [2024-12-02 15:23:11.862047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.062 [2024-12-02 15:23:11.862074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.862117] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.062 [2024-12-02 15:23:11.862132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.862183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.062 [2024-12-02 15:23:11.862198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.062 [2024-12-02 15:23:11.862250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.062 [2024-12-02 15:23:11.862263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.323 #31 NEW cov: 11855 ft: 14046 corp: 25/2104b lim: 100 exec/s: 31 rss: 70Mb L: 97/99 MS: 1 CrossOver- 00:07:26.323 [2024-12-02 15:23:11.902183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.323 [2024-12-02 15:23:11.902210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:11.902251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.323 [2024-12-02 15:23:11.902264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:11.902314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.323 [2024-12-02 15:23:11.902328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:11.902378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.323 [2024-12-02 15:23:11.902391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.323 #32 NEW cov: 11855 ft: 14066 corp: 26/2190b lim: 100 exec/s: 32 rss: 70Mb L: 86/99 MS: 1 ChangeByte- 00:07:26.323 [2024-12-02 
15:23:11.942270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.323 [2024-12-02 15:23:11.942296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:11.942340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.323 [2024-12-02 15:23:11.942354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:11.942404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.323 [2024-12-02 15:23:11.942419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:11.942469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.323 [2024-12-02 15:23:11.942482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.323 #33 NEW cov: 11855 ft: 14079 corp: 27/2287b lim: 100 exec/s: 33 rss: 70Mb L: 97/99 MS: 1 InsertRepeatedBytes- 00:07:26.323 [2024-12-02 15:23:11.982396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.323 [2024-12-02 15:23:11.982422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:11.982461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.323 [2024-12-02 15:23:11.982475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:11.982523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.323 [2024-12-02 15:23:11.982536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:11.982587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.323 [2024-12-02 15:23:11.982601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.323 #34 NEW cov: 11855 ft: 14103 corp: 28/2373b lim: 100 exec/s: 34 rss: 70Mb L: 86/99 MS: 1 ChangeBit- 00:07:26.323 [2024-12-02 15:23:12.022532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.323 [2024-12-02 15:23:12.022558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:12.022598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.323 [2024-12-02 15:23:12.022612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:12.022662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.323 [2024-12-02 15:23:12.022676] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:12.022726] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.323 [2024-12-02 15:23:12.022744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.323 #35 NEW cov: 11855 ft: 14113 corp: 29/2459b lim: 100 exec/s: 35 rss: 70Mb L: 86/99 MS: 1 ShuffleBytes- 00:07:26.323 [2024-12-02 15:23:12.062660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.323 [2024-12-02 15:23:12.062686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:12.062725] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.323 [2024-12-02 15:23:12.062744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:12.062792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.323 [2024-12-02 15:23:12.062807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:12.062856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.323 [2024-12-02 15:23:12.062869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:12.092553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.323 [2024-12-02 15:23:12.092579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.323 [2024-12-02 15:23:12.092614] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.323 [2024-12-02 15:23:12.092628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.323 #37 NEW cov: 11855 ft: 14186 corp: 30/2515b lim: 100 exec/s: 37 rss: 70Mb L: 56/99 MS: 2 CrossOver-EraseBytes- 00:07:26.324 [2024-12-02 15:23:12.132846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.324 [2024-12-02 15:23:12.132875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.324 [2024-12-02 15:23:12.132909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.324 [2024-12-02 15:23:12.132922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.324 [2024-12-02 15:23:12.132970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.324 [2024-12-02 15:23:12.132985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.324 [2024-12-02 15:23:12.133036] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.324 [2024-12-02 15:23:12.133049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.583 #38 NEW cov: 11855 ft: 14194 corp: 31/2613b lim: 100 exec/s: 38 rss: 70Mb L: 98/99 MS: 1 ShuffleBytes- 00:07:26.583 [2024-12-02 15:23:12.172740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.583 [2024-12-02 15:23:12.172766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.583 [2024-12-02 15:23:12.172806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.583 [2024-12-02 15:23:12.172820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.583 #39 NEW cov: 11855 ft: 14211 corp: 32/2653b lim: 100 exec/s: 39 rss: 70Mb L: 40/99 MS: 1 ChangeByte- 00:07:26.583 [2024-12-02 15:23:12.212767] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.583 [2024-12-02 15:23:12.212792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.583 #41 NEW cov: 11855 ft: 14625 corp: 33/2676b lim: 100 exec/s: 41 rss: 70Mb L: 23/99 MS: 2 CrossOver-PersAutoDict- DE: "\273\374\255:\277K\226\000"- 00:07:26.583 [2024-12-02 15:23:12.253364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.583 [2024-12-02 15:23:12.253390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.583 [2024-12-02 15:23:12.253437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.583 [2024-12-02 15:23:12.253451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.583 [2024-12-02 15:23:12.253499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.583 [2024-12-02 15:23:12.253514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.583 [2024-12-02 15:23:12.253564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.583 [2024-12-02 15:23:12.253577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.583 [2024-12-02 15:23:12.253628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:07:26.583 [2024-12-02 15:23:12.253642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:26.583 #42 NEW cov: 11855 ft: 14674 corp: 34/2776b lim: 100 exec/s: 42 rss: 70Mb L: 100/100 MS: 1 InsertRepeatedBytes- 00:07:26.583 [2024-12-02 15:23:12.293224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.583 [2024-12-02 15:23:12.293250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.583 [2024-12-02 15:23:12.293294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.583 [2024-12-02 15:23:12.293308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.583 [2024-12-02 15:23:12.293361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.583 [2024-12-02 15:23:12.293376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.584 #43 NEW cov: 11855 ft: 14700 corp: 35/2839b lim: 100 exec/s: 43 rss: 70Mb L: 63/100 MS: 1 CopyPart- 00:07:26.584 [2024-12-02 15:23:12.333460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.584 [2024-12-02 15:23:12.333484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.584 [2024-12-02 15:23:12.333530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.584 [2024-12-02 15:23:12.333544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.584 [2024-12-02 15:23:12.333595] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.584 [2024-12-02 15:23:12.333608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.584 [2024-12-02 15:23:12.333660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.584 [2024-12-02 15:23:12.333674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.584 #44 NEW cov: 11855 ft: 14720 corp: 36/2925b lim: 100 exec/s: 44 rss: 70Mb L: 86/100 MS: 1 ChangeBinInt- 00:07:26.584 [2024-12-02 15:23:12.363335] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.584 [2024-12-02 15:23:12.363361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.584 [2024-12-02 15:23:12.363412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.584 [2024-12-02 15:23:12.363427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.584 #45 NEW cov: 11855 ft: 14756 corp: 37/2977b lim: 100 exec/s: 45 rss: 70Mb L: 52/100 MS: 1 EraseBytes- 00:07:26.843 [2024-12-02 15:23:12.403744] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.843 [2024-12-02 15:23:12.403770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.843 [2024-12-02 15:23:12.403819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.843 [2024-12-02 15:23:12.403839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:07:26.843 [2024-12-02 15:23:12.403892] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.843 [2024-12-02 15:23:12.403906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.843 [2024-12-02 15:23:12.403958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.843 [2024-12-02 15:23:12.403972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.843 #46 NEW cov: 11855 ft: 14789 corp: 38/3071b lim: 100 exec/s: 46 rss: 70Mb L: 94/100 MS: 1 PersAutoDict- DE: "\273\374\255:\277K\226\000"- 00:07:26.843 [2024-12-02 15:23:12.443814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.843 [2024-12-02 15:23:12.443843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.843 [2024-12-02 15:23:12.443877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.843 [2024-12-02 15:23:12.443891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.843 [2024-12-02 15:23:12.443939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.844 [2024-12-02 15:23:12.443953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.844 [2024-12-02 15:23:12.444005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.844 [2024-12-02 15:23:12.444019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.844 #47 NEW cov: 11855 ft: 14795 corp: 39/3168b lim: 100 exec/s: 47 rss: 70Mb L: 97/100 MS: 1 ShuffleBytes- 00:07:26.844 [2024-12-02 15:23:12.483805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.844 [2024-12-02 15:23:12.483831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.844 [2024-12-02 15:23:12.483872] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.844 [2024-12-02 15:23:12.483886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.844 [2024-12-02 15:23:12.483937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.844 [2024-12-02 15:23:12.483950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.844 #48 NEW cov: 11855 ft: 14798 corp: 40/3245b lim: 100 exec/s: 48 rss: 70Mb L: 77/100 MS: 1 InsertRepeatedBytes- 00:07:26.844 [2024-12-02 15:23:12.524017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.844 [2024-12-02 15:23:12.524043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.844 
[2024-12-02 15:23:12.524084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.844 [2024-12-02 15:23:12.524098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.844 [2024-12-02 15:23:12.524150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.844 [2024-12-02 15:23:12.524164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.844 [2024-12-02 15:23:12.524215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.844 [2024-12-02 15:23:12.524229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.844 #49 NEW cov: 11855 ft: 14804 corp: 41/3343b lim: 100 exec/s: 49 rss: 70Mb L: 98/100 MS: 1 ChangeBinInt- 00:07:26.844 [2024-12-02 15:23:12.564019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.844 [2024-12-02 15:23:12.564044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.844 [2024-12-02 15:23:12.564081] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.844 [2024-12-02 15:23:12.564095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.844 [2024-12-02 15:23:12.564148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.844 [2024-12-02 15:23:12.564169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.844 #50 NEW cov: 11855 ft: 14816 corp: 42/3403b lim: 100 exec/s: 50 rss: 70Mb L: 60/100 MS: 1 ShuffleBytes- 00:07:26.844 [2024-12-02 15:23:12.604223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.844 [2024-12-02 15:23:12.604248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.844 [2024-12-02 15:23:12.604293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.844 [2024-12-02 15:23:12.604307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.844 [2024-12-02 15:23:12.604359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:26.844 [2024-12-02 15:23:12.604373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.844 [2024-12-02 15:23:12.604427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:26.844 [2024-12-02 15:23:12.604441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.844 #51 NEW cov: 11855 ft: 14864 corp: 43/3489b lim: 100 exec/s: 51 rss: 70Mb L: 86/100 MS: 1 ShuffleBytes- 00:07:26.844 [2024-12-02 15:23:12.644147] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:26.844 [2024-12-02 15:23:12.644173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.844 [2024-12-02 15:23:12.644226] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:26.844 [2024-12-02 15:23:12.644239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.116 #52 NEW cov: 11855 ft: 14875 corp: 44/3529b lim: 100 exec/s: 52 rss: 70Mb L: 40/100 MS: 1 ShuffleBytes- 00:07:27.116 [2024-12-02 15:23:12.684452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:27.116 [2024-12-02 15:23:12.684482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.116 [2024-12-02 15:23:12.684526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:27.116 [2024-12-02 15:23:12.684544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.116 [2024-12-02 15:23:12.684598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:27.116 [2024-12-02 15:23:12.684616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.116 [2024-12-02 15:23:12.684671] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:27.116 [2024-12-02 15:23:12.684686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.116 #53 NEW cov: 11855 ft: 14887 corp: 45/3612b lim: 100 exec/s: 26 rss: 70Mb L: 83/100 MS: 1 ChangeByte- 00:07:27.116 #53 DONE cov: 11855 ft: 14887 corp: 45/3612b lim: 100 exec/s: 26 rss: 70Mb 00:07:27.116 ###### Recommended dictionary. ###### 00:07:27.116 "\273\374\255:\277K\226\000" # Uses: 3 00:07:27.116 ###### End of recommended dictionary. 
###### 00:07:27.116 Done 53 runs in 2 second(s) 00:07:27.116 15:23:12 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:07:27.116 15:23:12 -- ../common.sh@72 -- # (( i++ )) 00:07:27.116 15:23:12 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:27.116 15:23:12 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:07:27.116 15:23:12 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:07:27.116 15:23:12 -- nvmf/run.sh@24 -- # local timen=1 00:07:27.116 15:23:12 -- nvmf/run.sh@25 -- # local core=0x1 00:07:27.116 15:23:12 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:27.116 15:23:12 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:07:27.116 15:23:12 -- nvmf/run.sh@29 -- # printf %02d 19 00:07:27.116 15:23:12 -- nvmf/run.sh@29 -- # port=4419 00:07:27.116 15:23:12 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:27.116 15:23:12 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:07:27.116 15:23:12 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:27.116 15:23:12 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:07:27.116 [2024-12-02 15:23:12.866343] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:27.116 [2024-12-02 15:23:12.866413] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1603991 ] 00:07:27.116 EAL: No free 2048 kB hugepages reported on node 1 00:07:27.415 [2024-12-02 15:23:13.046682] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.415 [2024-12-02 15:23:13.108955] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:27.415 [2024-12-02 15:23:13.109077] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.415 [2024-12-02 15:23:13.167094] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:27.415 [2024-12-02 15:23:13.183409] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:07:27.415 INFO: Running with entropic power schedule (0xFF, 100). 00:07:27.415 INFO: Seed: 3295117401 00:07:27.415 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:27.415 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:27.415 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:27.415 INFO: A corpus is not provided, starting from an empty corpus 00:07:27.415 #2 INITED exec/s: 0 rss: 61Mb 00:07:27.415 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:27.415 This may also happen if the target rejected all inputs we tried so far 00:07:27.735 [2024-12-02 15:23:13.238467] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999652631539 len:62452 00:07:27.735 [2024-12-02 15:23:13.238500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.735 NEW_FUNC[1/670]: 0x45ae18 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:07:27.735 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:27.735 #6 NEW cov: 11606 ft: 11592 corp: 2/20b lim: 50 exec/s: 0 rss: 68Mb L: 19/19 MS: 4 ChangeBit-CopyPart-CopyPart-InsertRepeatedBytes- 00:07:27.735 [2024-12-02 15:23:13.539488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578473983164281843 len:62452 00:07:27.735 [2024-12-02 15:23:13.539548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.735 [2024-12-02 15:23:13.539626] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:17578661999652631539 len:62283 00:07:27.735 [2024-12-02 15:23:13.539658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.994 #7 NEW cov: 11719 ft: 12634 corp: 3/40b lim: 50 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 InsertByte- 00:07:27.994 [2024-12-02 15:23:13.589280] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661995743540211 len:62452 00:07:27.994 [2024-12-02 15:23:13.589309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.994 #8 NEW cov: 11725 ft: 12866 corp: 4/51b lim: 50 exec/s: 0 rss: 68Mb L: 11/20 MS: 1 CrossOver- 00:07:27.994 [2024-12-02 15:23:13.629511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578473983164281827 len:62452 00:07:27.994 [2024-12-02 15:23:13.629539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.994 [2024-12-02 15:23:13.629593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:17578661999652631539 len:62283 00:07:27.994 [2024-12-02 15:23:13.629610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.994 #9 NEW cov: 11810 ft: 13135 corp: 5/71b lim: 50 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 ChangeBit- 00:07:27.994 [2024-12-02 15:23:13.669513] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999652631539 len:62452 00:07:27.994 [2024-12-02 15:23:13.669542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.994 #10 NEW cov: 11810 ft: 13250 corp: 6/90b lim: 50 exec/s: 0 rss: 68Mb L: 19/20 MS: 1 CrossOver- 00:07:27.994 [2024-12-02 15:23:13.709769] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999642801139 len:62452 00:07:27.994 [2024-12-02 15:23:13.709799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.994 [2024-12-02 15:23:13.709851] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:17578661999652631539 len:62283 00:07:27.994 [2024-12-02 15:23:13.709867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.994 #11 NEW cov: 11810 ft: 13297 corp: 7/110b lim: 50 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 InsertByte- 00:07:27.994 [2024-12-02 15:23:13.749742] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4060282880 len:19008 00:07:27.994 [2024-12-02 15:23:13.749770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.994 #16 NEW cov: 11810 ft: 13417 corp: 8/120b lim: 50 exec/s: 0 rss: 68Mb L: 10/20 MS: 5 InsertByte-ChangeByte-ChangeBit-ChangeBit-CMP- DE: "\362\003\000\000\000\000\000\000"- 00:07:27.994 [2024-12-02 15:23:13.789872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12876620382302548658 len:1 00:07:27.994 [2024-12-02 15:23:13.789900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.995 #24 NEW cov: 11810 ft: 13479 corp: 9/135b lim: 50 exec/s: 0 rss: 68Mb L: 15/20 MS: 3 ShuffleBytes-PersAutoDict-InsertRepeatedBytes- DE: "\362\003\000\000\000\000\000\000"- 00:07:28.254 [2024-12-02 15:23:13.820198] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999652631539 len:62452 00:07:28.254 [2024-12-02 15:23:13.820227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.254 [2024-12-02 15:23:13.820265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14106333704233415619 len:50116 00:07:28.254 [2024-12-02 15:23:13.820280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.254 [2024-12-02 15:23:13.820334] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14106333703424951235 len:50116 00:07:28.254 [2024-12-02 15:23:13.820352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.254 #25 NEW cov: 11810 ft: 13849 corp: 10/173b lim: 50 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:07:28.254 [2024-12-02 15:23:13.860678] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999652631539 len:62452 00:07:28.254 [2024-12-02 15:23:13.860705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.254 [2024-12-02 15:23:13.860753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14106333704233415619 len:50116 00:07:28.254 [2024-12-02 15:23:13.860786] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.254 [2024-12-02 15:23:13.860840] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:17565098217248650227 len:50116 00:07:28.254 [2024-12-02 15:23:13.860856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.254 [2024-12-02 15:23:13.860909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14106333703424951235 len:50116 00:07:28.254 [2024-12-02 15:23:13.860934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.254 #26 NEW cov: 11810 ft: 14184 corp: 11/222b lim: 50 exec/s: 0 rss: 69Mb L: 49/49 MS: 1 CopyPart- 00:07:28.254 [2024-12-02 15:23:13.900528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999652631539 len:62452 00:07:28.254 [2024-12-02 15:23:13.900557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.254 [2024-12-02 15:23:13.900592] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14106333704233415619 len:50116 00:07:28.254 [2024-12-02 15:23:13.900607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.254 [2024-12-02 15:23:13.900659] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14106332870201295811 len:1 00:07:28.254 [2024-12-02 15:23:13.900674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.254 [2024-12-02 15:23:13.900726] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14106386682869957571 len:62452 00:07:28.254 [2024-12-02 15:23:13.900745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.254 #27 NEW cov: 11810 ft: 14214 corp: 12/264b lim: 50 exec/s: 0 rss: 69Mb L: 42/49 MS: 1 CMP- DE: "\001\000\000\000"- 00:07:28.254 [2024-12-02 15:23:13.940340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999652631539 len:62452 00:07:28.254 [2024-12-02 15:23:13.940368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.254 #28 NEW cov: 11810 ft: 14223 corp: 13/283b lim: 50 exec/s: 0 rss: 69Mb L: 19/49 MS: 1 ShuffleBytes- 00:07:28.254 [2024-12-02 15:23:13.970431] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12876620382268994226 len:1 00:07:28.254 [2024-12-02 15:23:13.970460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.254 #29 NEW cov: 11810 ft: 14242 corp: 14/298b lim: 50 exec/s: 0 rss: 69Mb L: 15/49 MS: 1 ChangeBit- 00:07:28.254 [2024-12-02 15:23:14.010525] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:8536140393089332211 len:62452 
00:07:28.254 [2024-12-02 15:23:14.010553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.254 #30 NEW cov: 11810 ft: 14279 corp: 15/313b lim: 50 exec/s: 0 rss: 69Mb L: 15/49 MS: 1 InsertRepeatedBytes- 00:07:28.254 [2024-12-02 15:23:14.050750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4092723968 len:94 00:07:28.254 [2024-12-02 15:23:14.050778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.254 [2024-12-02 15:23:14.050813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:17578661999652631539 len:62452 00:07:28.254 [2024-12-02 15:23:14.050829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.514 #31 NEW cov: 11810 ft: 14377 corp: 16/341b lim: 50 exec/s: 0 rss: 69Mb L: 28/49 MS: 1 PersAutoDict- DE: "\362\003\000\000\000\000\000\000"- 00:07:28.514 [2024-12-02 15:23:14.090785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12876620382268994226 len:1 00:07:28.514 [2024-12-02 15:23:14.090813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.514 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:28.514 #32 NEW cov: 11833 ft: 14402 corp: 17/357b lim: 50 exec/s: 0 rss: 69Mb L: 16/49 MS: 1 InsertByte- 00:07:28.514 [2024-12-02 15:23:14.130959] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999642801139 len:62452 00:07:28.514 [2024-12-02 15:23:14.130987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.514 [2024-12-02 15:23:14.131025] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:17578661999652631539 len:62283 00:07:28.514 [2024-12-02 15:23:14.131039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.514 #33 NEW cov: 11833 ft: 14413 corp: 18/377b lim: 50 exec/s: 0 rss: 69Mb L: 20/49 MS: 1 CopyPart- 00:07:28.514 [2024-12-02 15:23:14.171098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4092723968 len:94 00:07:28.514 [2024-12-02 15:23:14.171126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.514 [2024-12-02 15:23:14.171182] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:17578661999652631539 len:62452 00:07:28.514 [2024-12-02 15:23:14.171197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.514 #34 NEW cov: 11833 ft: 14449 corp: 19/404b lim: 50 exec/s: 0 rss: 69Mb L: 27/49 MS: 1 EraseBytes- 00:07:28.514 [2024-12-02 15:23:14.211104] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999652631539 len:62210 00:07:28.514 [2024-12-02 15:23:14.211133] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.514 #35 NEW cov: 11833 ft: 14453 corp: 20/423b lim: 50 exec/s: 35 rss: 69Mb L: 19/49 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:28.514 [2024-12-02 15:23:14.241202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999652631539 len:62452 00:07:28.514 [2024-12-02 15:23:14.241231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.514 #36 NEW cov: 11833 ft: 14469 corp: 21/442b lim: 50 exec/s: 36 rss: 69Mb L: 19/49 MS: 1 ChangeBit- 00:07:28.514 [2024-12-02 15:23:14.281418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578473983164281843 len:62452 00:07:28.514 [2024-12-02 15:23:14.281448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.514 [2024-12-02 15:23:14.281500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:17578661999652631537 len:62283 00:07:28.514 [2024-12-02 15:23:14.281516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.514 #37 NEW cov: 11833 ft: 14475 corp: 22/462b lim: 50 exec/s: 37 rss: 69Mb L: 20/49 MS: 1 ChangeBit- 00:07:28.514 [2024-12-02 15:23:14.321419] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661995810649075 len:62452 00:07:28.514 [2024-12-02 15:23:14.321447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.773 #38 NEW cov: 11833 ft: 14574 corp: 23/473b lim: 50 exec/s: 38 rss: 69Mb L: 11/49 MS: 1 ChangeBit- 00:07:28.773 [2024-12-02 15:23:14.361522] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999620656131 len:62452 00:07:28.774 [2024-12-02 15:23:14.361551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.774 #39 NEW cov: 11833 ft: 14598 corp: 24/484b lim: 50 exec/s: 39 rss: 70Mb L: 11/49 MS: 1 ChangeBinInt- 00:07:28.774 [2024-12-02 15:23:14.401742] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4092723968 len:164 00:07:28.774 [2024-12-02 15:23:14.401769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.774 [2024-12-02 15:23:14.401804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:17578661995928089587 len:62452 00:07:28.774 [2024-12-02 15:23:14.401821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.774 #40 NEW cov: 11833 ft: 14689 corp: 25/512b lim: 50 exec/s: 40 rss: 70Mb L: 28/49 MS: 1 ChangeBinInt- 00:07:28.774 [2024-12-02 15:23:14.441778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:8571462740994324086 len:62452 00:07:28.774 [2024-12-02 15:23:14.441805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.774 #41 NEW cov: 11833 ft: 14708 corp: 26/524b lim: 50 exec/s: 41 rss: 70Mb L: 12/49 MS: 1 EraseBytes- 00:07:28.774 [2024-12-02 15:23:14.482329] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999652631539 len:62452 00:07:28.774 [2024-12-02 15:23:14.482358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.774 [2024-12-02 15:23:14.482403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14106333704233415619 len:50116 00:07:28.774 [2024-12-02 15:23:14.482418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.774 [2024-12-02 15:23:14.482471] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:17565098217248650227 len:50116 00:07:28.774 [2024-12-02 15:23:14.482487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.774 [2024-12-02 15:23:14.482540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14106333703424951235 len:62404 00:07:28.774 [2024-12-02 15:23:14.482556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.774 [2024-12-02 15:23:14.482612] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:17578661998844167107 len:62283 00:07:28.774 [2024-12-02 15:23:14.482627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:28.774 #42 NEW cov: 11833 ft: 14749 corp: 27/574b lim: 50 exec/s: 42 rss: 70Mb L: 50/50 MS: 1 CopyPart- 00:07:28.774 [2024-12-02 15:23:14.521994] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578659869348852723 len:1 00:07:28.774 [2024-12-02 15:23:14.522023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.774 #43 NEW cov: 11833 ft: 14756 corp: 28/593b lim: 50 exec/s: 43 rss: 70Mb L: 19/50 MS: 1 PersAutoDict- DE: "\362\003\000\000\000\000\000\000"- 00:07:28.774 [2024-12-02 15:23:14.552323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999652631539 len:62452 00:07:28.774 [2024-12-02 15:23:14.552351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.774 [2024-12-02 15:23:14.552387] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14106333704233415619 len:50116 00:07:28.774 [2024-12-02 15:23:14.552403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.774 [2024-12-02 15:23:14.552455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14123288430423294915 len:65536 00:07:28.774 [2024-12-02 15:23:14.552470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.774 #44 NEW cov: 11833 ft: 14765 corp: 29/631b lim: 50 exec/s: 44 rss: 70Mb L: 38/50 MS: 1 CMP- DE: "\377\377\377\377\377\377\377>"- 00:07:28.774 [2024-12-02 15:23:14.592437] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999652631539 len:62452 00:07:28.774 [2024-12-02 15:23:14.592465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.774 [2024-12-02 15:23:14.592501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14106333704233415619 len:50116 00:07:28.774 [2024-12-02 15:23:14.592516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.774 [2024-12-02 15:23:14.592573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14106333703424951235 len:50116 00:07:28.774 [2024-12-02 15:23:14.592589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:29.033 #45 NEW cov: 11833 ft: 14795 corp: 30/669b lim: 50 exec/s: 45 rss: 70Mb L: 38/50 MS: 1 ChangeBinInt- 00:07:29.033 [2024-12-02 15:23:14.632317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:8571324765178164211 len:62452 00:07:29.033 [2024-12-02 15:23:14.632345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.033 #46 NEW cov: 11833 ft: 14818 corp: 31/684b lim: 50 exec/s: 46 rss: 70Mb L: 15/50 MS: 1 ShuffleBytes- 00:07:29.033 [2024-12-02 15:23:14.672768] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999652631539 len:62452 00:07:29.033 [2024-12-02 15:23:14.672796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.033 [2024-12-02 15:23:14.672832] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14106333704233415619 len:50164 00:07:29.033 [2024-12-02 15:23:14.672847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.033 [2024-12-02 15:23:14.672902] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:17565098218050798579 len:50116 00:07:29.033 [2024-12-02 15:23:14.672918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:29.033 [2024-12-02 15:23:14.672971] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14106333703424951235 len:50116 00:07:29.033 [2024-12-02 15:23:14.672986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:29.033 #47 NEW cov: 11833 ft: 14895 corp: 32/733b lim: 50 exec/s: 47 rss: 70Mb L: 49/50 MS: 1 ShuffleBytes- 00:07:29.033 [2024-12-02 15:23:14.712649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578473983164281843 len:62452 00:07:29.033 [2024-12-02 
15:23:14.712677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.033 [2024-12-02 15:23:14.712727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:17578661999652631539 len:11595 00:07:29.033 [2024-12-02 15:23:14.712749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.033 #48 NEW cov: 11833 ft: 14898 corp: 33/753b lim: 50 exec/s: 48 rss: 70Mb L: 20/50 MS: 1 ChangeByte- 00:07:29.033 [2024-12-02 15:23:14.752776] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4092723968 len:62452 00:07:29.033 [2024-12-02 15:23:14.752805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.033 [2024-12-02 15:23:14.752854] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:17578661995575729651 len:62452 00:07:29.033 [2024-12-02 15:23:14.752871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.033 #49 NEW cov: 11833 ft: 14910 corp: 34/780b lim: 50 exec/s: 49 rss: 70Mb L: 27/50 MS: 1 ShuffleBytes- 00:07:29.033 [2024-12-02 15:23:14.793132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999652631539 len:62452 00:07:29.033 [2024-12-02 15:23:14.793161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.033 [2024-12-02 15:23:14.793202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14106333704233415619 len:50116 00:07:29.033 [2024-12-02 15:23:14.793217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.033 [2024-12-02 15:23:14.793271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:17565098217248650227 len:50116 00:07:29.033 [2024-12-02 15:23:14.793286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:29.033 [2024-12-02 15:23:14.793341] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14106333703424951235 len:50116 00:07:29.033 [2024-12-02 15:23:14.793357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:29.033 #50 NEW cov: 11833 ft: 14918 corp: 35/829b lim: 50 exec/s: 50 rss: 70Mb L: 49/50 MS: 1 ChangeBit- 00:07:29.033 [2024-12-02 15:23:14.833021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578473983164281827 len:50116 00:07:29.033 [2024-12-02 15:23:14.833049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.033 [2024-12-02 15:23:14.833103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14106333961122988995 len:65536 00:07:29.033 [2024-12-02 15:23:14.833121] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.292 #51 NEW cov: 11833 ft: 14928 corp: 36/849b lim: 50 exec/s: 51 rss: 70Mb L: 20/50 MS: 1 CrossOver- 00:07:29.292 [2024-12-02 15:23:14.873500] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999652631539 len:62452 00:07:29.292 [2024-12-02 15:23:14.873527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.292 [2024-12-02 15:23:14.873569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14106333704233415619 len:50116 00:07:29.292 [2024-12-02 15:23:14.873585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.292 [2024-12-02 15:23:14.873638] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:17565098217248650227 len:50116 00:07:29.292 [2024-12-02 15:23:14.873654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:29.292 [2024-12-02 15:23:14.873709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14106333703424951235 len:62404 00:07:29.292 [2024-12-02 15:23:14.873725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:29.292 [2024-12-02 15:23:14.873790] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:17578661998844167107 len:62283 00:07:29.292 [2024-12-02 15:23:14.873807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:29.292 #52 NEW cov: 11833 ft: 14936 corp: 37/899b lim: 50 exec/s: 52 rss: 70Mb L: 50/50 MS: 1 ShuffleBytes- 00:07:29.292 [2024-12-02 15:23:14.913370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999652631539 len:62452 00:07:29.292 [2024-12-02 15:23:14.913398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.292 [2024-12-02 15:23:14.913434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14106333128707797955 len:15421 00:07:29.292 [2024-12-02 15:23:14.913450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.292 [2024-12-02 15:23:14.913503] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14106333701059363779 len:50116 00:07:29.292 [2024-12-02 15:23:14.913519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:29.292 #53 NEW cov: 11833 ft: 14948 corp: 38/937b lim: 50 exec/s: 53 rss: 70Mb L: 38/50 MS: 1 ChangeBinInt- 00:07:29.292 [2024-12-02 15:23:14.953279] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12876620382268994226 len:1 00:07:29.292 [2024-12-02 15:23:14.953308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.292 #54 NEW cov: 11833 ft: 14956 corp: 39/953b lim: 50 exec/s: 54 rss: 70Mb L: 16/50 MS: 1 InsertByte- 00:07:29.292 [2024-12-02 15:23:14.993613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999652631539 len:62452 00:07:29.292 [2024-12-02 15:23:14.993641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.292 [2024-12-02 15:23:14.993677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14106333128707797955 len:15421 00:07:29.292 [2024-12-02 15:23:14.993696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.292 [2024-12-02 15:23:14.993752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14106333700992254915 len:50116 00:07:29.292 [2024-12-02 15:23:14.993768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:29.292 #55 NEW cov: 11833 ft: 14981 corp: 40/991b lim: 50 exec/s: 55 rss: 70Mb L: 38/50 MS: 1 ChangeASCIIInt- 00:07:29.292 [2024-12-02 15:23:15.033707] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4092723968 len:94 00:07:29.292 [2024-12-02 15:23:15.033738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.292 [2024-12-02 15:23:15.033781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:268229095391488 len:62452 00:07:29.292 [2024-12-02 15:23:15.033796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.292 [2024-12-02 15:23:15.033848] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:17578661999652631539 len:62452 00:07:29.292 [2024-12-02 15:23:15.033864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:29.292 #56 NEW cov: 11833 ft: 14988 corp: 41/1022b lim: 50 exec/s: 56 rss: 70Mb L: 31/50 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:29.292 [2024-12-02 15:23:15.073608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17488236783604990898 len:1 00:07:29.292 [2024-12-02 15:23:15.073636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.292 #57 NEW cov: 11833 ft: 14989 corp: 42/1037b lim: 50 exec/s: 57 rss: 70Mb L: 15/50 MS: 1 ShuffleBytes- 00:07:29.551 [2024-12-02 15:23:15.113997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:17578661999642801139 len:62452 00:07:29.551 [2024-12-02 15:23:15.114024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.551 [2024-12-02 15:23:15.114058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:17578662051192239091 len:65536 00:07:29.551 [2024-12-02 15:23:15.114074] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.551 [2024-12-02 15:23:15.114126] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:29.551 [2024-12-02 15:23:15.114142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:29.551 #58 NEW cov: 11833 ft: 15017 corp: 43/1074b lim: 50 exec/s: 58 rss: 70Mb L: 37/50 MS: 1 InsertRepeatedBytes- 00:07:29.551 [2024-12-02 15:23:15.153876] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12876620383582683135 len:1 00:07:29.551 [2024-12-02 15:23:15.153903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.551 #59 NEW cov: 11833 ft: 15044 corp: 44/1089b lim: 50 exec/s: 59 rss: 70Mb L: 15/50 MS: 1 CMP- DE: "\376\377\377\377"- 00:07:29.551 [2024-12-02 15:23:15.194068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12898309331458634418 len:65536 00:07:29.551 [2024-12-02 15:23:15.194095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.551 [2024-12-02 15:23:15.194134] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:17438782186403612594 len:1 00:07:29.551 [2024-12-02 15:23:15.194154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.551 #60 NEW cov: 11833 ft: 15050 corp: 45/1113b lim: 50 exec/s: 60 rss: 70Mb L: 24/50 MS: 1 CMP- DE: "\377\377\377\377\377\377\377G"- 00:07:29.551 [2024-12-02 15:23:15.234312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4092723968 len:94 00:07:29.551 [2024-12-02 15:23:15.234340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.551 [2024-12-02 15:23:15.234375] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:267726584217856 len:62452 00:07:29.551 [2024-12-02 15:23:15.234391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.551 [2024-12-02 15:23:15.234442] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:17578661999652631539 len:62452 00:07:29.551 [2024-12-02 15:23:15.234458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:29.551 #61 NEW cov: 11833 ft: 15051 corp: 46/1144b lim: 50 exec/s: 30 rss: 70Mb L: 31/50 MS: 1 ChangeByte- 00:07:29.551 #61 DONE cov: 11833 ft: 15051 corp: 46/1144b lim: 50 exec/s: 30 rss: 70Mb 00:07:29.551 ###### Recommended dictionary. ###### 00:07:29.551 "\362\003\000\000\000\000\000\000" # Uses: 3 00:07:29.551 "\001\000\000\000" # Uses: 2 00:07:29.551 "\377\377\377\377\377\377\377>" # Uses: 0 00:07:29.551 "\376\377\377\377" # Uses: 0 00:07:29.551 "\377\377\377\377\377\377\377G" # Uses: 0 00:07:29.551 ###### End of recommended dictionary. 
###### 00:07:29.551 Done 61 runs in 2 second(s) 00:07:29.809 15:23:15 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:07:29.809 15:23:15 -- ../common.sh@72 -- # (( i++ )) 00:07:29.809 15:23:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:29.810 15:23:15 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:07:29.810 15:23:15 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:07:29.810 15:23:15 -- nvmf/run.sh@24 -- # local timen=1 00:07:29.810 15:23:15 -- nvmf/run.sh@25 -- # local core=0x1 00:07:29.810 15:23:15 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:29.810 15:23:15 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:07:29.810 15:23:15 -- nvmf/run.sh@29 -- # printf %02d 20 00:07:29.810 15:23:15 -- nvmf/run.sh@29 -- # port=4420 00:07:29.810 15:23:15 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:29.810 15:23:15 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:07:29.810 15:23:15 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:29.810 15:23:15 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:07:29.810 [2024-12-02 15:23:15.422839] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:29.810 [2024-12-02 15:23:15.422930] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1604452 ] 00:07:29.810 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.810 [2024-12-02 15:23:15.600153] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.067 [2024-12-02 15:23:15.662805] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:30.067 [2024-12-02 15:23:15.662926] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.067 [2024-12-02 15:23:15.720815] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:30.067 [2024-12-02 15:23:15.737161] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:07:30.067 INFO: Running with entropic power schedule (0xFF, 100). 00:07:30.067 INFO: Seed: 1552142999 00:07:30.067 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:30.067 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:30.067 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:30.067 INFO: A corpus is not provided, starting from an empty corpus 00:07:30.067 #2 INITED exec/s: 0 rss: 60Mb 00:07:30.067 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:30.068 This may also happen if the target rejected all inputs we tried so far 00:07:30.068 [2024-12-02 15:23:15.807298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.068 [2024-12-02 15:23:15.807332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.068 [2024-12-02 15:23:15.807453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.068 [2024-12-02 15:23:15.807476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.068 [2024-12-02 15:23:15.807605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:30.068 [2024-12-02 15:23:15.807630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.325 NEW_FUNC[1/672]: 0x45c9d8 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:07:30.325 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:30.325 #3 NEW cov: 11646 ft: 11662 corp: 2/70b lim: 90 exec/s: 0 rss: 68Mb L: 69/69 MS: 1 InsertRepeatedBytes- 00:07:30.325 [2024-12-02 15:23:16.138412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.325 [2024-12-02 15:23:16.138468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.325 [2024-12-02 15:23:16.138616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.325 [2024-12-02 15:23:16.138657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.325 [2024-12-02 15:23:16.138810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:30.325 [2024-12-02 15:23:16.138844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.584 #4 NEW cov: 11777 ft: 12164 corp: 3/139b lim: 90 exec/s: 0 rss: 68Mb L: 69/69 MS: 1 ChangeBit- 00:07:30.584 [2024-12-02 15:23:16.198211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.584 [2024-12-02 15:23:16.198247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.584 [2024-12-02 15:23:16.198379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.584 [2024-12-02 15:23:16.198404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.584 #5 NEW cov: 11783 ft: 12773 corp: 4/181b lim: 90 exec/s: 0 rss: 68Mb L: 42/69 MS: 1 CrossOver- 00:07:30.584 [2024-12-02 15:23:16.258650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.584 [2024-12-02 15:23:16.258685] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.584 [2024-12-02 15:23:16.258825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.584 [2024-12-02 15:23:16.258849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.584 [2024-12-02 15:23:16.258984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:30.584 [2024-12-02 15:23:16.259006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.584 #11 NEW cov: 11868 ft: 13035 corp: 5/250b lim: 90 exec/s: 0 rss: 68Mb L: 69/69 MS: 1 CopyPart- 00:07:30.584 [2024-12-02 15:23:16.308793] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.584 [2024-12-02 15:23:16.308826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.584 [2024-12-02 15:23:16.308932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.584 [2024-12-02 15:23:16.308950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.584 [2024-12-02 15:23:16.309085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:30.584 [2024-12-02 15:23:16.309107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.584 #12 NEW cov: 11868 ft: 13129 corp: 6/319b lim: 90 exec/s: 0 rss: 68Mb L: 69/69 MS: 1 ChangeBinInt- 00:07:30.584 [2024-12-02 15:23:16.359098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.584 [2024-12-02 15:23:16.359132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.584 [2024-12-02 15:23:16.359275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.584 [2024-12-02 15:23:16.359296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.584 [2024-12-02 15:23:16.359427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:30.584 [2024-12-02 15:23:16.359448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.584 #13 NEW cov: 11868 ft: 13214 corp: 7/388b lim: 90 exec/s: 0 rss: 68Mb L: 69/69 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:30.843 [2024-12-02 15:23:16.409316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.843 [2024-12-02 15:23:16.409349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.843 [2024-12-02 15:23:16.409491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.843 [2024-12-02 15:23:16.409518] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.843 [2024-12-02 15:23:16.409654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:30.843 [2024-12-02 15:23:16.409676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.843 #17 NEW cov: 11868 ft: 13313 corp: 8/459b lim: 90 exec/s: 0 rss: 68Mb L: 71/71 MS: 4 CopyPart-ChangeBinInt-ChangeBit-CrossOver- 00:07:30.843 [2024-12-02 15:23:16.459477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.843 [2024-12-02 15:23:16.459509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.843 [2024-12-02 15:23:16.459636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.843 [2024-12-02 15:23:16.459658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.843 [2024-12-02 15:23:16.459798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:30.843 [2024-12-02 15:23:16.459822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.843 #18 NEW cov: 11868 ft: 13328 corp: 9/528b lim: 90 exec/s: 0 rss: 68Mb L: 69/71 MS: 1 ChangeBit- 00:07:30.843 [2024-12-02 15:23:16.509579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.843 [2024-12-02 15:23:16.509612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.843 [2024-12-02 15:23:16.509743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.843 [2024-12-02 15:23:16.509765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.843 [2024-12-02 15:23:16.509896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:30.843 [2024-12-02 15:23:16.509917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.843 #19 NEW cov: 11868 ft: 13342 corp: 10/597b lim: 90 exec/s: 0 rss: 68Mb L: 69/71 MS: 1 ChangeBit- 00:07:30.843 [2024-12-02 15:23:16.559776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.843 [2024-12-02 15:23:16.559809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.843 [2024-12-02 15:23:16.559953] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.843 [2024-12-02 15:23:16.559974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.843 [2024-12-02 15:23:16.560100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:30.843 [2024-12-02 
15:23:16.560125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.843 #20 NEW cov: 11868 ft: 13395 corp: 11/666b lim: 90 exec/s: 0 rss: 68Mb L: 69/71 MS: 1 ShuffleBytes- 00:07:30.843 [2024-12-02 15:23:16.609987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.843 [2024-12-02 15:23:16.610020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.843 [2024-12-02 15:23:16.610160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.843 [2024-12-02 15:23:16.610181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.843 [2024-12-02 15:23:16.610309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:30.843 [2024-12-02 15:23:16.610332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.843 #21 NEW cov: 11868 ft: 13409 corp: 12/736b lim: 90 exec/s: 0 rss: 68Mb L: 70/71 MS: 1 InsertByte- 00:07:30.843 [2024-12-02 15:23:16.660412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:30.843 [2024-12-02 15:23:16.660444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.843 [2024-12-02 15:23:16.660545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:30.843 [2024-12-02 15:23:16.660570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.843 [2024-12-02 15:23:16.660723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:30.843 [2024-12-02 15:23:16.660750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.843 [2024-12-02 15:23:16.660880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:30.843 [2024-12-02 15:23:16.660903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.101 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:31.101 #22 NEW cov: 11891 ft: 13774 corp: 13/813b lim: 90 exec/s: 0 rss: 69Mb L: 77/77 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:31.101 [2024-12-02 15:23:16.720274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.101 [2024-12-02 15:23:16.720306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.101 [2024-12-02 15:23:16.720446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.101 [2024-12-02 15:23:16.720471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.101 
[2024-12-02 15:23:16.720605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.101 [2024-12-02 15:23:16.720627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.102 #23 NEW cov: 11891 ft: 13807 corp: 14/882b lim: 90 exec/s: 0 rss: 69Mb L: 69/77 MS: 1 ChangeByte- 00:07:31.102 [2024-12-02 15:23:16.770507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.102 [2024-12-02 15:23:16.770540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.102 [2024-12-02 15:23:16.770673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.102 [2024-12-02 15:23:16.770695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.102 [2024-12-02 15:23:16.770837] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.102 [2024-12-02 15:23:16.770860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.102 #24 NEW cov: 11891 ft: 13822 corp: 15/951b lim: 90 exec/s: 24 rss: 69Mb L: 69/77 MS: 1 CMP- DE: "\024\347\305\033\303K\226\000"- 00:07:31.102 [2024-12-02 15:23:16.830664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.102 [2024-12-02 15:23:16.830698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.102 [2024-12-02 15:23:16.830807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.102 [2024-12-02 15:23:16.830822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.102 [2024-12-02 15:23:16.830964] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.102 [2024-12-02 15:23:16.830986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.102 #25 NEW cov: 11891 ft: 13868 corp: 16/1020b lim: 90 exec/s: 25 rss: 69Mb L: 69/77 MS: 1 ChangeBit- 00:07:31.102 [2024-12-02 15:23:16.880756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.102 [2024-12-02 15:23:16.880801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.102 [2024-12-02 15:23:16.880946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.102 [2024-12-02 15:23:16.880969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.102 [2024-12-02 15:23:16.881110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.102 [2024-12-02 15:23:16.881134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 
m:0 dnr:1 00:07:31.102 #26 NEW cov: 11891 ft: 13962 corp: 17/1079b lim: 90 exec/s: 26 rss: 69Mb L: 59/77 MS: 1 CopyPart- 00:07:31.361 [2024-12-02 15:23:16.941292] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.361 [2024-12-02 15:23:16.941327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.361 [2024-12-02 15:23:16.941485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.361 [2024-12-02 15:23:16.941510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.361 [2024-12-02 15:23:16.941644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.361 [2024-12-02 15:23:16.941669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.361 [2024-12-02 15:23:16.941804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:31.361 [2024-12-02 15:23:16.941826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.361 #27 NEW cov: 11891 ft: 13977 corp: 18/1156b lim: 90 exec/s: 27 rss: 69Mb L: 77/77 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:31.361 [2024-12-02 15:23:17.001462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.361 [2024-12-02 15:23:17.001495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.361 [2024-12-02 15:23:17.001652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.361 [2024-12-02 15:23:17.001672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.361 [2024-12-02 15:23:17.001812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.361 [2024-12-02 15:23:17.001833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.361 [2024-12-02 15:23:17.001980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:31.361 [2024-12-02 15:23:17.001999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.361 #28 NEW cov: 11891 ft: 14013 corp: 19/1233b lim: 90 exec/s: 28 rss: 69Mb L: 77/77 MS: 1 PersAutoDict- DE: "\024\347\305\033\303K\226\000"- 00:07:31.361 [2024-12-02 15:23:17.061433] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.361 [2024-12-02 15:23:17.061465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.361 [2024-12-02 15:23:17.061592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.361 [2024-12-02 15:23:17.061617] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.361 [2024-12-02 15:23:17.061752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.361 [2024-12-02 15:23:17.061780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.361 #29 NEW cov: 11891 ft: 14044 corp: 20/1296b lim: 90 exec/s: 29 rss: 69Mb L: 63/77 MS: 1 EraseBytes- 00:07:31.361 [2024-12-02 15:23:17.111532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.361 [2024-12-02 15:23:17.111564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.361 [2024-12-02 15:23:17.111677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.361 [2024-12-02 15:23:17.111699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.361 [2024-12-02 15:23:17.111836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.361 [2024-12-02 15:23:17.111861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.361 #30 NEW cov: 11891 ft: 14059 corp: 21/1365b lim: 90 exec/s: 30 rss: 69Mb L: 69/77 MS: 1 ChangeBit- 00:07:31.361 [2024-12-02 15:23:17.161748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.361 [2024-12-02 15:23:17.161782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.361 [2024-12-02 15:23:17.161909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.361 [2024-12-02 15:23:17.161929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.361 [2024-12-02 15:23:17.162070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.361 [2024-12-02 15:23:17.162095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.361 #31 NEW cov: 11891 ft: 14073 corp: 22/1426b lim: 90 exec/s: 31 rss: 69Mb L: 61/77 MS: 1 EraseBytes- 00:07:31.621 [2024-12-02 15:23:17.212097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.621 [2024-12-02 15:23:17.212128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.621 [2024-12-02 15:23:17.212234] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.621 [2024-12-02 15:23:17.212260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.621 [2024-12-02 15:23:17.212392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.621 [2024-12-02 15:23:17.212411] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.621 [2024-12-02 15:23:17.212548] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:31.621 [2024-12-02 15:23:17.212566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.621 #32 NEW cov: 11891 ft: 14082 corp: 23/1503b lim: 90 exec/s: 32 rss: 69Mb L: 77/77 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:31.621 [2024-12-02 15:23:17.272167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.621 [2024-12-02 15:23:17.272201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.621 [2024-12-02 15:23:17.272329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.621 [2024-12-02 15:23:17.272358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.621 [2024-12-02 15:23:17.272492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.621 [2024-12-02 15:23:17.272515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.621 #33 NEW cov: 11891 ft: 14104 corp: 24/1572b lim: 90 exec/s: 33 rss: 69Mb L: 69/77 MS: 1 ChangeBinInt- 00:07:31.621 [2024-12-02 15:23:17.332452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.621 [2024-12-02 15:23:17.332497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.621 [2024-12-02 15:23:17.332625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.621 [2024-12-02 15:23:17.332650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.621 [2024-12-02 15:23:17.332777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.621 [2024-12-02 15:23:17.332800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.621 [2024-12-02 15:23:17.332942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:31.621 [2024-12-02 15:23:17.332965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.621 #34 NEW cov: 11891 ft: 14110 corp: 25/1649b lim: 90 exec/s: 34 rss: 69Mb L: 77/77 MS: 1 ChangeByte- 00:07:31.621 [2024-12-02 15:23:17.392502] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.621 [2024-12-02 15:23:17.392534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.621 [2024-12-02 15:23:17.392668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.621 [2024-12-02 15:23:17.392691] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.621 [2024-12-02 15:23:17.392822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.621 [2024-12-02 15:23:17.392843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.621 #35 NEW cov: 11891 ft: 14192 corp: 26/1718b lim: 90 exec/s: 35 rss: 70Mb L: 69/77 MS: 1 ChangeBit- 00:07:31.881 [2024-12-02 15:23:17.442799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.881 [2024-12-02 15:23:17.442835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.881 [2024-12-02 15:23:17.442942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.881 [2024-12-02 15:23:17.442962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.881 [2024-12-02 15:23:17.443098] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.881 [2024-12-02 15:23:17.443120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.881 [2024-12-02 15:23:17.443260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:31.881 [2024-12-02 15:23:17.443287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.881 #36 NEW cov: 11891 ft: 14293 corp: 27/1790b lim: 90 exec/s: 36 rss: 70Mb L: 72/77 MS: 1 InsertByte- 00:07:31.881 [2024-12-02 15:23:17.502310] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.881 [2024-12-02 15:23:17.502346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.881 #37 NEW cov: 11891 ft: 15101 corp: 28/1822b lim: 90 exec/s: 37 rss: 70Mb L: 32/77 MS: 1 CrossOver- 00:07:31.881 [2024-12-02 15:23:17.553276] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.881 [2024-12-02 15:23:17.553311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.881 [2024-12-02 15:23:17.553444] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.881 [2024-12-02 15:23:17.553472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.881 [2024-12-02 15:23:17.553616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.881 [2024-12-02 15:23:17.553637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.881 [2024-12-02 15:23:17.553761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:31.881 [2024-12-02 15:23:17.553783] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.881 #38 NEW cov: 11891 ft: 15114 corp: 29/1901b lim: 90 exec/s: 38 rss: 70Mb L: 79/79 MS: 1 InsertRepeatedBytes- 00:07:31.881 [2024-12-02 15:23:17.603021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.881 [2024-12-02 15:23:17.603056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.881 [2024-12-02 15:23:17.603185] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:31.881 [2024-12-02 15:23:17.603204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.881 [2024-12-02 15:23:17.603339] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:31.881 [2024-12-02 15:23:17.603360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.881 #39 NEW cov: 11891 ft: 15154 corp: 30/1972b lim: 90 exec/s: 39 rss: 70Mb L: 71/79 MS: 1 ShuffleBytes- 00:07:31.881 [2024-12-02 15:23:17.652728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:31.881 [2024-12-02 15:23:17.652770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.881 #40 NEW cov: 11891 ft: 15195 corp: 31/2004b lim: 90 exec/s: 40 rss: 70Mb L: 32/79 MS: 1 ShuffleBytes- 00:07:32.141 [2024-12-02 15:23:17.713013] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:32.141 [2024-12-02 15:23:17.713040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.141 #41 NEW cov: 11891 ft: 15204 corp: 32/2036b lim: 90 exec/s: 41 rss: 70Mb L: 32/79 MS: 1 CopyPart- 00:07:32.141 [2024-12-02 15:23:17.773723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:32.141 [2024-12-02 15:23:17.773760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.141 [2024-12-02 15:23:17.773888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:32.141 [2024-12-02 15:23:17.773909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.141 [2024-12-02 15:23:17.774049] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:32.141 [2024-12-02 15:23:17.774077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.141 #42 NEW cov: 11891 ft: 15218 corp: 33/2095b lim: 90 exec/s: 21 rss: 70Mb L: 59/79 MS: 1 CopyPart- 00:07:32.141 #42 DONE cov: 11891 ft: 15218 corp: 33/2095b lim: 90 exec/s: 21 rss: 70Mb 00:07:32.141 ###### Recommended dictionary. 
###### 00:07:32.141 "\001\000\000\000\000\000\000\000" # Uses: 2 00:07:32.141 "\024\347\305\033\303K\226\000" # Uses: 1 00:07:32.141 ###### End of recommended dictionary. ###### 00:07:32.141 Done 42 runs in 2 second(s) 00:07:32.141 15:23:17 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:07:32.141 15:23:17 -- ../common.sh@72 -- # (( i++ )) 00:07:32.141 15:23:17 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:32.141 15:23:17 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:07:32.141 15:23:17 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:07:32.141 15:23:17 -- nvmf/run.sh@24 -- # local timen=1 00:07:32.141 15:23:17 -- nvmf/run.sh@25 -- # local core=0x1 00:07:32.141 15:23:17 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:32.141 15:23:17 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:07:32.141 15:23:17 -- nvmf/run.sh@29 -- # printf %02d 21 00:07:32.141 15:23:17 -- nvmf/run.sh@29 -- # port=4421 00:07:32.141 15:23:17 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:32.141 15:23:17 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:07:32.141 15:23:17 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:32.141 15:23:17 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:07:32.401 [2024-12-02 15:23:17.961762] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:32.401 [2024-12-02 15:23:17.961853] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1604990 ] 00:07:32.401 EAL: No free 2048 kB hugepages reported on node 1 00:07:32.401 [2024-12-02 15:23:18.139907] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.401 [2024-12-02 15:23:18.202036] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:32.401 [2024-12-02 15:23:18.202155] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.661 [2024-12-02 15:23:18.260235] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:32.661 [2024-12-02 15:23:18.276549] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:07:32.661 INFO: Running with entropic power schedule (0xFF, 100). 00:07:32.661 INFO: Seed: 4091133358 00:07:32.661 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:32.661 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:32.661 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:32.661 INFO: A corpus is not provided, starting from an empty corpus 00:07:32.661 #2 INITED exec/s: 0 rss: 59Mb 00:07:32.661 WARNING: no interesting inputs were found so far. 
Is the code instrumented for coverage? 00:07:32.661 This may also happen if the target rejected all inputs we tried so far 00:07:32.661 [2024-12-02 15:23:18.325638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:32.661 [2024-12-02 15:23:18.325668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.661 [2024-12-02 15:23:18.325705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:32.661 [2024-12-02 15:23:18.325720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.661 [2024-12-02 15:23:18.325782] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:32.661 [2024-12-02 15:23:18.325798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.661 [2024-12-02 15:23:18.325850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:32.661 [2024-12-02 15:23:18.325866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:32.661 [2024-12-02 15:23:18.325920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:32.661 [2024-12-02 15:23:18.325937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:32.921 NEW_FUNC[1/671]: 0x45fc08 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:07:32.921 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:32.921 #4 NEW cov: 11634 ft: 11640 corp: 2/51b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:32.921 [2024-12-02 15:23:18.626401] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:32.921 [2024-12-02 15:23:18.626438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.921 [2024-12-02 15:23:18.626500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:32.921 [2024-12-02 15:23:18.626517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.921 [2024-12-02 15:23:18.626575] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:32.921 [2024-12-02 15:23:18.626593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.921 [2024-12-02 15:23:18.626652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:32.921 [2024-12-02 15:23:18.626670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:32.921 [2024-12-02 15:23:18.626729] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:32.921 [2024-12-02 15:23:18.626751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:32.921 NEW_FUNC[1/1]: 0x1c310f8 in accel_comp_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/accel/accel_sw.c:554 00:07:32.921 #5 NEW cov: 11752 ft: 12123 corp: 3/101b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 CopyPart- 00:07:32.921 [2024-12-02 15:23:18.676479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:32.921 [2024-12-02 15:23:18.676508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.921 [2024-12-02 15:23:18.676556] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:32.921 [2024-12-02 15:23:18.676572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.921 [2024-12-02 15:23:18.676629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:32.921 [2024-12-02 15:23:18.676643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.921 [2024-12-02 15:23:18.676699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:32.921 [2024-12-02 15:23:18.676718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:32.921 [2024-12-02 15:23:18.676775] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:32.921 [2024-12-02 15:23:18.676791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:32.921 #8 NEW cov: 11758 ft: 12383 corp: 4/151b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 3 CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:07:32.921 [2024-12-02 15:23:18.716635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:32.921 [2024-12-02 15:23:18.716663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.921 [2024-12-02 15:23:18.716711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:32.921 [2024-12-02 15:23:18.716727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.921 [2024-12-02 15:23:18.716785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:32.921 [2024-12-02 15:23:18.716799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.921 [2024-12-02 15:23:18.716854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:32.921 [2024-12-02 15:23:18.716868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:32.921 
[2024-12-02 15:23:18.716923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:32.921 [2024-12-02 15:23:18.716937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.181 #9 NEW cov: 11843 ft: 12681 corp: 5/201b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ShuffleBytes- 00:07:33.181 [2024-12-02 15:23:18.766747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.181 [2024-12-02 15:23:18.766776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.181 [2024-12-02 15:23:18.766825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.181 [2024-12-02 15:23:18.766840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.181 [2024-12-02 15:23:18.766896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.181 [2024-12-02 15:23:18.766912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.181 [2024-12-02 15:23:18.766967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.181 [2024-12-02 15:23:18.766982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.181 [2024-12-02 15:23:18.767038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.181 [2024-12-02 15:23:18.767054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.181 #10 NEW cov: 11843 ft: 12943 corp: 6/251b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ChangeByte- 00:07:33.181 [2024-12-02 15:23:18.806854] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.181 [2024-12-02 15:23:18.806884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.181 [2024-12-02 15:23:18.806926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.181 [2024-12-02 15:23:18.806946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.181 [2024-12-02 15:23:18.807001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.181 [2024-12-02 15:23:18.807016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.181 [2024-12-02 15:23:18.807072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.181 [2024-12-02 15:23:18.807088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.181 [2024-12-02 15:23:18.807145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 
nsid:0 00:07:33.181 [2024-12-02 15:23:18.807161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.181 #16 NEW cov: 11843 ft: 13079 corp: 7/301b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ChangeByte- 00:07:33.181 [2024-12-02 15:23:18.846996] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.181 [2024-12-02 15:23:18.847025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.181 [2024-12-02 15:23:18.847070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.181 [2024-12-02 15:23:18.847085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.181 [2024-12-02 15:23:18.847140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.181 [2024-12-02 15:23:18.847156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.181 [2024-12-02 15:23:18.847214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.181 [2024-12-02 15:23:18.847228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.181 [2024-12-02 15:23:18.847282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.181 [2024-12-02 15:23:18.847298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.181 #17 NEW cov: 11843 ft: 13127 corp: 8/351b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ChangeBinInt- 00:07:33.181 [2024-12-02 15:23:18.886448] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.181 [2024-12-02 15:23:18.886477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.181 #27 NEW cov: 11843 ft: 14130 corp: 9/367b lim: 50 exec/s: 0 rss: 67Mb L: 16/50 MS: 5 CopyPart-InsertByte-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:07:33.182 [2024-12-02 15:23:18.927183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.182 [2024-12-02 15:23:18.927211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.182 [2024-12-02 15:23:18.927257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.182 [2024-12-02 15:23:18.927273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.182 [2024-12-02 15:23:18.927328] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.182 [2024-12-02 15:23:18.927346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.182 [2024-12-02 15:23:18.927405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.182 [2024-12-02 15:23:18.927421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.182 [2024-12-02 15:23:18.927478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.182 [2024-12-02 15:23:18.927495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.182 #28 NEW cov: 11843 ft: 14146 corp: 10/417b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 CrossOver- 00:07:33.182 [2024-12-02 15:23:18.967314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.182 [2024-12-02 15:23:18.967342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.182 [2024-12-02 15:23:18.967391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.182 [2024-12-02 15:23:18.967408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.182 [2024-12-02 15:23:18.967463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.182 [2024-12-02 15:23:18.967478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.182 [2024-12-02 15:23:18.967532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.182 [2024-12-02 15:23:18.967547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.182 [2024-12-02 15:23:18.967602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.182 [2024-12-02 15:23:18.967616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.182 #39 NEW cov: 11843 ft: 14199 corp: 11/467b lim: 50 exec/s: 0 rss: 67Mb L: 50/50 MS: 1 ChangeBinInt- 00:07:33.441 [2024-12-02 15:23:19.007437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.442 [2024-12-02 15:23:19.007464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.007514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.442 [2024-12-02 15:23:19.007531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.007587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.442 [2024-12-02 15:23:19.007603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.007658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.442 [2024-12-02 15:23:19.007674] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.007730] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.442 [2024-12-02 15:23:19.007751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.442 #40 NEW cov: 11843 ft: 14236 corp: 12/517b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 ChangeBinInt- 00:07:33.442 [2024-12-02 15:23:19.047512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.442 [2024-12-02 15:23:19.047540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.047581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.442 [2024-12-02 15:23:19.047597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.047651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.442 [2024-12-02 15:23:19.047666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.047722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.442 [2024-12-02 15:23:19.047743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.047798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.442 [2024-12-02 15:23:19.047814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.442 #41 NEW cov: 11843 ft: 14333 corp: 13/567b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 ShuffleBytes- 00:07:33.442 [2024-12-02 15:23:19.087646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.442 [2024-12-02 15:23:19.087674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.087723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.442 [2024-12-02 15:23:19.087744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.087799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.442 [2024-12-02 15:23:19.087815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.087869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.442 [2024-12-02 15:23:19.087885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 
dnr:1 00:07:33.442 [2024-12-02 15:23:19.087942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.442 [2024-12-02 15:23:19.087958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.442 #42 NEW cov: 11843 ft: 14348 corp: 14/617b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 CopyPart- 00:07:33.442 [2024-12-02 15:23:19.127742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.442 [2024-12-02 15:23:19.127770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.127822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.442 [2024-12-02 15:23:19.127837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.127895] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.442 [2024-12-02 15:23:19.127911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.127968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.442 [2024-12-02 15:23:19.127983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.128044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.442 [2024-12-02 15:23:19.128060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.442 #43 NEW cov: 11843 ft: 14358 corp: 15/667b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 ShuffleBytes- 00:07:33.442 [2024-12-02 15:23:19.167899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.442 [2024-12-02 15:23:19.167927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.167977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.442 [2024-12-02 15:23:19.167993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.168048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.442 [2024-12-02 15:23:19.168063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.168120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.442 [2024-12-02 15:23:19.168134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.168190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.442 [2024-12-02 15:23:19.168206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.442 #44 NEW cov: 11843 ft: 14391 corp: 16/717b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 ChangeBinInt- 00:07:33.442 [2024-12-02 15:23:19.208007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.442 [2024-12-02 15:23:19.208035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.208084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.442 [2024-12-02 15:23:19.208101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.208158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.442 [2024-12-02 15:23:19.208174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.208231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.442 [2024-12-02 15:23:19.208246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.208304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.442 [2024-12-02 15:23:19.208319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.442 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:33.442 #45 NEW cov: 11866 ft: 14429 corp: 17/767b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 CrossOver- 00:07:33.442 [2024-12-02 15:23:19.248120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.442 [2024-12-02 15:23:19.248148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.248207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.442 [2024-12-02 15:23:19.248227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.248284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.442 [2024-12-02 15:23:19.248300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.248356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.442 [2024-12-02 15:23:19.248371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.442 [2024-12-02 15:23:19.248428] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.442 [2024-12-02 15:23:19.248444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.702 #51 NEW cov: 11866 ft: 14457 corp: 18/817b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 CopyPart- 00:07:33.702 [2024-12-02 15:23:19.287951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.702 [2024-12-02 15:23:19.287980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.702 [2024-12-02 15:23:19.288032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.702 [2024-12-02 15:23:19.288048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.702 [2024-12-02 15:23:19.288104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.702 [2024-12-02 15:23:19.288119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.702 #52 NEW cov: 11866 ft: 14777 corp: 19/853b lim: 50 exec/s: 52 rss: 68Mb L: 36/50 MS: 1 EraseBytes- 00:07:33.702 [2024-12-02 15:23:19.328385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.702 [2024-12-02 15:23:19.328413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.702 [2024-12-02 15:23:19.328455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.702 [2024-12-02 15:23:19.328470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.702 [2024-12-02 15:23:19.328527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.702 [2024-12-02 15:23:19.328543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.702 [2024-12-02 15:23:19.328599] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.702 [2024-12-02 15:23:19.328615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.702 [2024-12-02 15:23:19.328672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.702 [2024-12-02 15:23:19.328688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.702 #53 NEW cov: 11866 ft: 14789 corp: 20/903b lim: 50 exec/s: 53 rss: 68Mb L: 50/50 MS: 1 CrossOver- 00:07:33.702 [2024-12-02 15:23:19.367850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.703 [2024-12-02 15:23:19.367877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.703 #59 NEW cov: 11866 ft: 14827 corp: 21/919b lim: 50 exec/s: 59 rss: 68Mb L: 16/50 MS: 1 
ShuffleBytes- 00:07:33.703 [2024-12-02 15:23:19.408293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.703 [2024-12-02 15:23:19.408321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.703 [2024-12-02 15:23:19.408359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.703 [2024-12-02 15:23:19.408374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.703 [2024-12-02 15:23:19.408433] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.703 [2024-12-02 15:23:19.408449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.703 #60 NEW cov: 11866 ft: 14877 corp: 22/955b lim: 50 exec/s: 60 rss: 68Mb L: 36/50 MS: 1 ChangeBinInt- 00:07:33.703 [2024-12-02 15:23:19.448726] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.703 [2024-12-02 15:23:19.448756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.703 [2024-12-02 15:23:19.448824] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.703 [2024-12-02 15:23:19.448840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.703 [2024-12-02 15:23:19.448896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.703 [2024-12-02 15:23:19.448912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.703 [2024-12-02 15:23:19.448969] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.703 [2024-12-02 15:23:19.448985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.703 [2024-12-02 15:23:19.449042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.703 [2024-12-02 15:23:19.449060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.703 #61 NEW cov: 11866 ft: 14900 corp: 23/1005b lim: 50 exec/s: 61 rss: 68Mb L: 50/50 MS: 1 ChangeByte- 00:07:33.703 [2024-12-02 15:23:19.488843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.703 [2024-12-02 15:23:19.488870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.703 [2024-12-02 15:23:19.488920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.703 [2024-12-02 15:23:19.488936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.703 [2024-12-02 15:23:19.488993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.703 [2024-12-02 15:23:19.489008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.703 [2024-12-02 15:23:19.489065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.703 [2024-12-02 15:23:19.489079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.703 [2024-12-02 15:23:19.489136] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.703 [2024-12-02 15:23:19.489153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.703 #62 NEW cov: 11866 ft: 14904 corp: 24/1055b lim: 50 exec/s: 62 rss: 68Mb L: 50/50 MS: 1 ShuffleBytes- 00:07:33.963 [2024-12-02 15:23:19.528802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.963 [2024-12-02 15:23:19.528830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.528870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.963 [2024-12-02 15:23:19.528886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.528944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.963 [2024-12-02 15:23:19.528960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.529018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.963 [2024-12-02 15:23:19.529033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.963 #63 NEW cov: 11866 ft: 14978 corp: 25/1097b lim: 50 exec/s: 63 rss: 68Mb L: 42/50 MS: 1 CrossOver- 00:07:33.963 [2024-12-02 15:23:19.569096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.963 [2024-12-02 15:23:19.569122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.569170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.963 [2024-12-02 15:23:19.569186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.569243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.963 [2024-12-02 15:23:19.569258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.569313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.963 [2024-12-02 15:23:19.569329] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.569386] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.963 [2024-12-02 15:23:19.569402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.963 #64 NEW cov: 11866 ft: 15002 corp: 26/1147b lim: 50 exec/s: 64 rss: 68Mb L: 50/50 MS: 1 ChangeBinInt- 00:07:33.963 [2024-12-02 15:23:19.608593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.963 [2024-12-02 15:23:19.608621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.648998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.963 [2024-12-02 15:23:19.649026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.649065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.963 [2024-12-02 15:23:19.649082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.649140] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.963 [2024-12-02 15:23:19.649157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.963 #66 NEW cov: 11866 ft: 15017 corp: 27/1183b lim: 50 exec/s: 66 rss: 68Mb L: 36/50 MS: 2 CMP-CrossOver- DE: "\377\377\377\377\377\377\377\000"- 00:07:33.963 [2024-12-02 15:23:19.689436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.963 [2024-12-02 15:23:19.689463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.689513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.963 [2024-12-02 15:23:19.689529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.689585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.963 [2024-12-02 15:23:19.689600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.689657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.963 [2024-12-02 15:23:19.689673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.689727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.963 [2024-12-02 15:23:19.689747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.963 #67 NEW cov: 11866 ft: 15072 corp: 28/1233b lim: 50 exec/s: 67 rss: 68Mb L: 50/50 MS: 1 ChangeBinInt- 00:07:33.963 [2024-12-02 15:23:19.729547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.963 [2024-12-02 15:23:19.729574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.729627] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.963 [2024-12-02 15:23:19.729643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.729695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.963 [2024-12-02 15:23:19.729711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.729782] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:33.963 [2024-12-02 15:23:19.729799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.729856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:33.963 [2024-12-02 15:23:19.729872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:33.963 #68 NEW cov: 11866 ft: 15080 corp: 29/1283b lim: 50 exec/s: 68 rss: 68Mb L: 50/50 MS: 1 ChangeBinInt- 00:07:33.963 [2024-12-02 15:23:19.769312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:33.963 [2024-12-02 15:23:19.769340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.769377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:33.963 [2024-12-02 15:23:19.769392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.963 [2024-12-02 15:23:19.769446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:33.963 [2024-12-02 15:23:19.769463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.224 #69 NEW cov: 11866 ft: 15088 corp: 30/1319b lim: 50 exec/s: 69 rss: 69Mb L: 36/50 MS: 1 EraseBytes- 00:07:34.224 [2024-12-02 15:23:19.809818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:34.224 [2024-12-02 15:23:19.809845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.809896] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:34.224 [2024-12-02 15:23:19.809911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.809965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:34.224 [2024-12-02 15:23:19.809981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.810035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:34.224 [2024-12-02 15:23:19.810050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.810105] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:34.224 [2024-12-02 15:23:19.810120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:34.224 #70 NEW cov: 11866 ft: 15097 corp: 31/1369b lim: 50 exec/s: 70 rss: 69Mb L: 50/50 MS: 1 ChangeByte- 00:07:34.224 [2024-12-02 15:23:19.849927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:34.224 [2024-12-02 15:23:19.849954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.850005] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:34.224 [2024-12-02 15:23:19.850020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.850076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:34.224 [2024-12-02 15:23:19.850092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.850147] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:34.224 [2024-12-02 15:23:19.850174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.850231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:34.224 [2024-12-02 15:23:19.850248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:34.224 #71 NEW cov: 11866 ft: 15110 corp: 32/1419b lim: 50 exec/s: 71 rss: 69Mb L: 50/50 MS: 1 CrossOver- 00:07:34.224 [2024-12-02 15:23:19.889987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:34.224 [2024-12-02 15:23:19.890015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.890070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:34.224 [2024-12-02 15:23:19.890087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.890143] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:34.224 [2024-12-02 15:23:19.890162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.890219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:34.224 [2024-12-02 15:23:19.890236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.890294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:34.224 [2024-12-02 15:23:19.890311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:34.224 #72 NEW cov: 11866 ft: 15124 corp: 33/1469b lim: 50 exec/s: 72 rss: 69Mb L: 50/50 MS: 1 CopyPart- 00:07:34.224 [2024-12-02 15:23:19.930172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:34.224 [2024-12-02 15:23:19.930199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.930250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:34.224 [2024-12-02 15:23:19.930267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.930324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:34.224 [2024-12-02 15:23:19.930340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.930394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:34.224 [2024-12-02 15:23:19.930410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.930466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:34.224 [2024-12-02 15:23:19.930482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:34.224 #73 NEW cov: 11866 ft: 15137 corp: 34/1519b lim: 50 exec/s: 73 rss: 69Mb L: 50/50 MS: 1 ShuffleBytes- 00:07:34.224 [2024-12-02 15:23:19.970278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:34.224 [2024-12-02 15:23:19.970306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.970356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:34.224 [2024-12-02 15:23:19.970372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.970427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:34.224 [2024-12-02 
15:23:19.970443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.970497] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:34.224 [2024-12-02 15:23:19.970511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:34.224 [2024-12-02 15:23:19.970566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:34.224 [2024-12-02 15:23:19.970582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:34.224 #74 NEW cov: 11866 ft: 15142 corp: 35/1569b lim: 50 exec/s: 74 rss: 69Mb L: 50/50 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\000"- 00:07:34.224 [2024-12-02 15:23:20.009744] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:34.224 [2024-12-02 15:23:20.009772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.224 #77 NEW cov: 11866 ft: 15149 corp: 36/1586b lim: 50 exec/s: 77 rss: 69Mb L: 17/50 MS: 3 ChangeByte-CrossOver-PersAutoDict- DE: "\377\377\377\377\377\377\377\000"- 00:07:34.484 [2024-12-02 15:23:20.050466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:34.484 [2024-12-02 15:23:20.050495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.484 [2024-12-02 15:23:20.050545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:34.484 [2024-12-02 15:23:20.050561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.484 [2024-12-02 15:23:20.050618] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:34.484 [2024-12-02 15:23:20.050634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.484 [2024-12-02 15:23:20.050691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:34.484 [2024-12-02 15:23:20.050707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:34.484 [2024-12-02 15:23:20.050762] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:34.484 [2024-12-02 15:23:20.050779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:34.484 #78 NEW cov: 11866 ft: 15176 corp: 37/1636b lim: 50 exec/s: 78 rss: 69Mb L: 50/50 MS: 1 ChangeByte- 00:07:34.484 [2024-12-02 15:23:20.090012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:34.484 [2024-12-02 15:23:20.090040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.484 #79 NEW cov: 11866 ft: 15193 corp: 38/1652b lim: 50 
exec/s: 79 rss: 69Mb L: 16/50 MS: 1 EraseBytes- 00:07:34.484 [2024-12-02 15:23:20.140791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:34.484 [2024-12-02 15:23:20.140820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.484 [2024-12-02 15:23:20.140867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:34.484 [2024-12-02 15:23:20.140882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.484 [2024-12-02 15:23:20.140936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:34.484 [2024-12-02 15:23:20.140952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.484 [2024-12-02 15:23:20.141004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:34.484 [2024-12-02 15:23:20.141018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:34.484 [2024-12-02 15:23:20.141073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:34.484 [2024-12-02 15:23:20.141091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:34.484 #80 NEW cov: 11866 ft: 15253 corp: 39/1702b lim: 50 exec/s: 80 rss: 69Mb L: 50/50 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\000"- 00:07:34.484 [2024-12-02 15:23:20.180905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:34.484 [2024-12-02 15:23:20.180936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.484 [2024-12-02 15:23:20.180975] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:34.484 [2024-12-02 15:23:20.180990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.484 [2024-12-02 15:23:20.181046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:34.484 [2024-12-02 15:23:20.181062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.484 [2024-12-02 15:23:20.181115] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:34.485 [2024-12-02 15:23:20.181129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:34.485 [2024-12-02 15:23:20.181185] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:34.485 [2024-12-02 15:23:20.181201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:34.485 #81 NEW cov: 11866 ft: 15283 corp: 40/1752b lim: 50 exec/s: 81 rss: 69Mb L: 50/50 MS: 1 ShuffleBytes- 00:07:34.485 [2024-12-02 
15:23:20.220959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:34.485 [2024-12-02 15:23:20.220987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.485 [2024-12-02 15:23:20.221037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:34.485 [2024-12-02 15:23:20.221053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.485 [2024-12-02 15:23:20.221109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:34.485 [2024-12-02 15:23:20.221125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.485 [2024-12-02 15:23:20.221179] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:34.485 [2024-12-02 15:23:20.221195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:34.485 [2024-12-02 15:23:20.221264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:34.485 [2024-12-02 15:23:20.221279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:34.485 #82 NEW cov: 11866 ft: 15314 corp: 41/1802b lim: 50 exec/s: 82 rss: 69Mb L: 50/50 MS: 1 ChangeBit- 00:07:34.485 [2024-12-02 15:23:20.261119] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:34.485 [2024-12-02 15:23:20.261146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.485 [2024-12-02 15:23:20.261200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:34.485 [2024-12-02 15:23:20.261216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.485 [2024-12-02 15:23:20.261271] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:34.485 [2024-12-02 15:23:20.261287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.485 [2024-12-02 15:23:20.261342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:34.485 [2024-12-02 15:23:20.261361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:34.485 [2024-12-02 15:23:20.261415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:07:34.485 [2024-12-02 15:23:20.261431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:34.485 #83 NEW cov: 11866 ft: 15329 corp: 42/1852b lim: 50 exec/s: 83 rss: 69Mb L: 50/50 MS: 1 ChangeByte- 00:07:34.485 [2024-12-02 15:23:20.300766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 
00:07:34.485 [2024-12-02 15:23:20.300793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.485 [2024-12-02 15:23:20.300832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:34.485 [2024-12-02 15:23:20.300848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.744 #84 NEW cov: 11866 ft: 15593 corp: 43/1878b lim: 50 exec/s: 42 rss: 69Mb L: 26/50 MS: 1 EraseBytes- 00:07:34.744 #84 DONE cov: 11866 ft: 15593 corp: 43/1878b lim: 50 exec/s: 42 rss: 69Mb 00:07:34.744 ###### Recommended dictionary. ###### 00:07:34.744 "\377\377\377\377\377\377\377\000" # Uses: 3 00:07:34.744 ###### End of recommended dictionary. ###### 00:07:34.744 Done 84 runs in 2 second(s) 00:07:34.744 15:23:20 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:07:34.744 15:23:20 -- ../common.sh@72 -- # (( i++ )) 00:07:34.744 15:23:20 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:34.744 15:23:20 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:07:34.744 15:23:20 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:07:34.744 15:23:20 -- nvmf/run.sh@24 -- # local timen=1 00:07:34.744 15:23:20 -- nvmf/run.sh@25 -- # local core=0x1 00:07:34.744 15:23:20 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:34.744 15:23:20 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:07:34.744 15:23:20 -- nvmf/run.sh@29 -- # printf %02d 22 00:07:34.744 15:23:20 -- nvmf/run.sh@29 -- # port=4422 00:07:34.744 15:23:20 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:34.744 15:23:20 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:07:34.744 15:23:20 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:34.745 15:23:20 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:07:34.745 [2024-12-02 15:23:20.482959] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:34.745 [2024-12-02 15:23:20.483028] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1605343 ] 00:07:34.745 EAL: No free 2048 kB hugepages reported on node 1 00:07:35.004 [2024-12-02 15:23:20.669844] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:35.004 [2024-12-02 15:23:20.738278] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:35.004 [2024-12-02 15:23:20.738405] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.004 [2024-12-02 15:23:20.796861] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:35.004 [2024-12-02 15:23:20.813184] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:07:35.263 INFO: Running with entropic power schedule (0xFF, 100). 00:07:35.263 INFO: Seed: 2334188281 00:07:35.263 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:35.263 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:35.263 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:35.263 INFO: A corpus is not provided, starting from an empty corpus 00:07:35.263 #2 INITED exec/s: 0 rss: 60Mb 00:07:35.263 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:35.263 This may also happen if the target rejected all inputs we tried so far 00:07:35.263 [2024-12-02 15:23:20.879160] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.263 [2024-12-02 15:23:20.879197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.263 [2024-12-02 15:23:20.879309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:35.263 [2024-12-02 15:23:20.879332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.523 NEW_FUNC[1/672]: 0x461ed8 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:07:35.523 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:35.523 #10 NEW cov: 11665 ft: 11666 corp: 2/49b lim: 85 exec/s: 0 rss: 68Mb L: 48/48 MS: 3 ChangeByte-ChangeBit-InsertRepeatedBytes- 00:07:35.523 [2024-12-02 15:23:21.200467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.523 [2024-12-02 15:23:21.200521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.523 [2024-12-02 15:23:21.200656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:35.523 [2024-12-02 15:23:21.200684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.523 [2024-12-02 15:23:21.200814] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 
00:07:35.523 [2024-12-02 15:23:21.200840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.523 [2024-12-02 15:23:21.200972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:35.523 [2024-12-02 15:23:21.201000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:35.523 #12 NEW cov: 11778 ft: 12769 corp: 3/131b lim: 85 exec/s: 0 rss: 68Mb L: 82/82 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:35.523 [2024-12-02 15:23:21.240463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.523 [2024-12-02 15:23:21.240494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.523 [2024-12-02 15:23:21.240587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:35.523 [2024-12-02 15:23:21.240607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.523 [2024-12-02 15:23:21.240716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:35.523 [2024-12-02 15:23:21.240739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.523 [2024-12-02 15:23:21.240865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:35.523 [2024-12-02 15:23:21.240889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:35.523 #18 NEW cov: 11784 ft: 12994 corp: 4/213b lim: 85 exec/s: 0 rss: 68Mb L: 82/82 MS: 1 ChangeBinInt- 00:07:35.523 [2024-12-02 15:23:21.280643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.523 [2024-12-02 15:23:21.280673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.523 [2024-12-02 15:23:21.280794] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:35.523 [2024-12-02 15:23:21.280816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.523 [2024-12-02 15:23:21.280923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:35.523 [2024-12-02 15:23:21.280945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.523 [2024-12-02 15:23:21.281061] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:35.523 [2024-12-02 15:23:21.281081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:35.523 #19 NEW cov: 11869 ft: 13257 corp: 5/295b lim: 85 exec/s: 0 rss: 68Mb L: 82/82 MS: 1 ChangeByte- 00:07:35.523 [2024-12-02 15:23:21.330723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) 
sqid:1 cid:0 nsid:0 00:07:35.523 [2024-12-02 15:23:21.330759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.523 [2024-12-02 15:23:21.330839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:35.523 [2024-12-02 15:23:21.330858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.523 [2024-12-02 15:23:21.330975] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:35.523 [2024-12-02 15:23:21.330994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.523 [2024-12-02 15:23:21.331108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:35.523 [2024-12-02 15:23:21.331126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:35.782 #20 NEW cov: 11869 ft: 13412 corp: 6/379b lim: 85 exec/s: 0 rss: 68Mb L: 84/84 MS: 1 CMP- DE: "\377\000"- 00:07:35.782 [2024-12-02 15:23:21.371181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.782 [2024-12-02 15:23:21.371213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.371298] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:35.782 [2024-12-02 15:23:21.371317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.371434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:35.782 [2024-12-02 15:23:21.371455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.371566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:35.782 [2024-12-02 15:23:21.371582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.371696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:35.782 [2024-12-02 15:23:21.371718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:35.782 #21 NEW cov: 11869 ft: 13504 corp: 7/464b lim: 85 exec/s: 0 rss: 68Mb L: 85/85 MS: 1 CopyPart- 00:07:35.782 [2024-12-02 15:23:21.421018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.782 [2024-12-02 15:23:21.421050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.421151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:35.782 [2024-12-02 15:23:21.421175] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.421288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:35.782 [2024-12-02 15:23:21.421311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.421430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:35.782 [2024-12-02 15:23:21.421450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:35.782 #22 NEW cov: 11869 ft: 13571 corp: 8/546b lim: 85 exec/s: 0 rss: 68Mb L: 82/85 MS: 1 ChangeBit- 00:07:35.782 [2024-12-02 15:23:21.461064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.782 [2024-12-02 15:23:21.461095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.461197] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:35.782 [2024-12-02 15:23:21.461232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.461349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:35.782 [2024-12-02 15:23:21.461367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.461485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:35.782 [2024-12-02 15:23:21.461508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:35.782 #23 NEW cov: 11869 ft: 13618 corp: 9/628b lim: 85 exec/s: 0 rss: 68Mb L: 82/85 MS: 1 ShuffleBytes- 00:07:35.782 [2024-12-02 15:23:21.501278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.782 [2024-12-02 15:23:21.501311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.501402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:35.782 [2024-12-02 15:23:21.501424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.501538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:35.782 [2024-12-02 15:23:21.501560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.501682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:35.782 [2024-12-02 15:23:21.501703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:35.782 #24 NEW 
cov: 11869 ft: 13635 corp: 10/710b lim: 85 exec/s: 0 rss: 68Mb L: 82/85 MS: 1 ChangeBinInt- 00:07:35.782 [2024-12-02 15:23:21.540938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.782 [2024-12-02 15:23:21.540967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.541086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:35.782 [2024-12-02 15:23:21.541107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.782 #25 NEW cov: 11869 ft: 13685 corp: 11/752b lim: 85 exec/s: 0 rss: 68Mb L: 42/85 MS: 1 EraseBytes- 00:07:35.782 [2024-12-02 15:23:21.581535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:35.782 [2024-12-02 15:23:21.581567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.581637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:35.782 [2024-12-02 15:23:21.581658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.581770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:35.782 [2024-12-02 15:23:21.581792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.782 [2024-12-02 15:23:21.581900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:35.782 [2024-12-02 15:23:21.581922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.042 #26 NEW cov: 11869 ft: 13696 corp: 12/834b lim: 85 exec/s: 0 rss: 69Mb L: 82/85 MS: 1 PersAutoDict- DE: "\377\000"- 00:07:36.042 [2024-12-02 15:23:21.621594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.042 [2024-12-02 15:23:21.621622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.042 [2024-12-02 15:23:21.621719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.042 [2024-12-02 15:23:21.621739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.042 [2024-12-02 15:23:21.621851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.042 [2024-12-02 15:23:21.621869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.042 [2024-12-02 15:23:21.621984] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.042 [2024-12-02 15:23:21.622005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
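
The #N NEW records above are libFuzzer status lines: cov is the number of instrumented edges reached so far, ft counts features (roughly edge and hit-count combinations), corp gives the corpus size in files and bytes, lim is the current input-length cap, exec/s and rss report throughput and memory, L shows the new input's length against the largest in the corpus, and MS names the mutation sequence that produced it (EraseBytes, CopyPart, ChangeBit, and so on), with DE listing any dictionary entry used. As a minimal sketch, not part of the test run itself, the headline numbers of one such line can be pulled out with a single sscanf; the variable names here are ours, not libFuzzer's:

    /* Parsing sketch for one "NEW" status line from this log; the format
     * string mirrors the lines above, error handling kept minimal. */
    #include <stdio.h>

    int main(void)
    {
        const char *line = "#25 NEW cov: 11869 ft: 13685 corp: 11/752b "
                           "lim: 85 exec/s: 0 rss: 68Mb";
        unsigned long id, cov, ft, files, bytes, lim, execs, rss;

        if (sscanf(line,
                   "#%lu NEW cov: %lu ft: %lu corp: %lu/%lub lim: %lu "
                   "exec/s: %lu rss: %luMb",
                   &id, &cov, &ft, &files, &bytes, &lim, &execs, &rss) == 8) {
            printf("input #%lu: %lu edges, corpus %lu files / %lu bytes\n",
                   id, cov, files, bytes);
        }
        return 0;
    }

The same pattern should work for the REDUCE and DONE variants of these lines, which differ only in the leading keyword.
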
00:07:36.042 #27 NEW cov: 11869 ft: 13740 corp: 13/916b lim: 85 exec/s: 0 rss: 69Mb L: 82/85 MS: 1 CopyPart- 00:07:36.042 [2024-12-02 15:23:21.661763] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.042 [2024-12-02 15:23:21.661793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.042 [2024-12-02 15:23:21.661863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.042 [2024-12-02 15:23:21.661885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.042 [2024-12-02 15:23:21.661999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.042 [2024-12-02 15:23:21.662020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.042 [2024-12-02 15:23:21.662139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.042 [2024-12-02 15:23:21.662162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.042 #28 NEW cov: 11869 ft: 13751 corp: 14/1000b lim: 85 exec/s: 0 rss: 69Mb L: 84/85 MS: 1 ChangeBit- 00:07:36.042 [2024-12-02 15:23:21.701863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.042 [2024-12-02 15:23:21.701893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.042 [2024-12-02 15:23:21.701978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.042 [2024-12-02 15:23:21.701999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.042 [2024-12-02 15:23:21.702112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.042 [2024-12-02 15:23:21.702135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.042 [2024-12-02 15:23:21.702253] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.042 [2024-12-02 15:23:21.702273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.042 #29 NEW cov: 11869 ft: 13764 corp: 15/1082b lim: 85 exec/s: 0 rss: 69Mb L: 82/85 MS: 1 ChangeBit- 00:07:36.042 [2024-12-02 15:23:21.741787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.042 [2024-12-02 15:23:21.741819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.042 [2024-12-02 15:23:21.741914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.042 [2024-12-02 15:23:21.741952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
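
Each command print above is paired with a completion print, and the (00/0b) in "INVALID NAMESPACE OR FORMAT (00/0b)" is the completion's status code type and status code (SCT 0x00, generic command status; SC 0x0b). The remaining fields are the queue and command identifiers (qid, cid), the command-specific result (cdw0), the submission queue head pointer (sqhd), and the phase, more, and do-not-retry bits (p, m, dnr). A rough sketch of the status layout behind these lines, following the NVMe specification's completion entry rather than SPDK's exact struct definitions:

    /* Illustrative NVMe completion-status layout; bit widths follow the
     * spec's CQE status field (P, SC, SCT, CRD, M, DNR). */
    #include <stdint.h>
    #include <stdio.h>

    struct cpl_status {
        uint16_t p   : 1;  /* phase tag */
        uint16_t sc  : 8;  /* status code, 0x0b in the log */
        uint16_t sct : 3;  /* status code type, 0x0 (generic) in the log */
        uint16_t crd : 2;  /* command retry delay */
        uint16_t m   : 1;  /* more information available */
        uint16_t dnr : 1;  /* do not retry */
    };

    int main(void)
    {
        struct cpl_status st = { .p = 0, .sc = 0x0b, .sct = 0x0,
                                 .crd = 0, .m = 0, .dnr = 1 };
        /* Reproduces the "(00/0b) ... p:0 m:0 dnr:1" shape of the log. */
        printf("(%02x/%02x) p:%u m:%u dnr:%u\n",
               (unsigned)st.sct, (unsigned)st.sc,
               (unsigned)st.p, (unsigned)st.m, (unsigned)st.dnr);
        return 0;
    }

dnr:1 on every completion here tells the initiator not to retry, which is consistent with the target cleanly rejecting the fuzzed reservation commands rather than crashing.
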
00:07:36.042 [2024-12-02 15:23:21.742071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.042 [2024-12-02 15:23:21.742093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.042 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:36.042 #30 NEW cov: 11892 ft: 14081 corp: 16/1149b lim: 85 exec/s: 0 rss: 69Mb L: 67/85 MS: 1 CrossOver- 00:07:36.042 [2024-12-02 15:23:21.792193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.042 [2024-12-02 15:23:21.792225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.042 [2024-12-02 15:23:21.792307] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.042 [2024-12-02 15:23:21.792328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.042 [2024-12-02 15:23:21.792439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.042 [2024-12-02 15:23:21.792469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.042 [2024-12-02 15:23:21.792586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.042 [2024-12-02 15:23:21.792607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.042 #31 NEW cov: 11892 ft: 14100 corp: 17/1231b lim: 85 exec/s: 0 rss: 69Mb L: 82/85 MS: 1 CopyPart- 00:07:36.042 [2024-12-02 15:23:21.831839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.042 [2024-12-02 15:23:21.831873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.042 [2024-12-02 15:23:21.831987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.042 [2024-12-02 15:23:21.832008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.042 #32 NEW cov: 11892 ft: 14158 corp: 18/1275b lim: 85 exec/s: 32 rss: 69Mb L: 44/85 MS: 1 EraseBytes- 00:07:36.308 [2024-12-02 15:23:21.872418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.308 [2024-12-02 15:23:21.872450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.308 [2024-12-02 15:23:21.872539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.308 [2024-12-02 15:23:21.872560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.308 [2024-12-02 15:23:21.872677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.308 [2024-12-02 15:23:21.872699] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.308 [2024-12-02 15:23:21.872820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.308 [2024-12-02 15:23:21.872842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.308 #33 NEW cov: 11892 ft: 14180 corp: 19/1357b lim: 85 exec/s: 33 rss: 69Mb L: 82/85 MS: 1 PersAutoDict- DE: "\377\000"- 00:07:36.308 [2024-12-02 15:23:21.912610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.308 [2024-12-02 15:23:21.912635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.308 [2024-12-02 15:23:21.912759] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.308 [2024-12-02 15:23:21.912783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.308 [2024-12-02 15:23:21.912899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.308 [2024-12-02 15:23:21.912920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.308 [2024-12-02 15:23:21.913038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.308 [2024-12-02 15:23:21.913060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.308 #34 NEW cov: 11892 ft: 14223 corp: 20/1441b lim: 85 exec/s: 34 rss: 69Mb L: 84/85 MS: 1 PersAutoDict- DE: "\377\000"- 00:07:36.308 [2024-12-02 15:23:21.952623] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.308 [2024-12-02 15:23:21.952657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.308 [2024-12-02 15:23:21.952746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.308 [2024-12-02 15:23:21.952768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.308 [2024-12-02 15:23:21.952890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.308 [2024-12-02 15:23:21.952914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.308 [2024-12-02 15:23:21.953043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.308 [2024-12-02 15:23:21.953065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.308 #35 NEW cov: 11892 ft: 14231 corp: 21/1523b lim: 85 exec/s: 35 rss: 69Mb L: 82/85 MS: 1 ChangeBinInt- 00:07:36.308 [2024-12-02 15:23:21.992530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 
00:07:36.308 [2024-12-02 15:23:21.992566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.308 [2024-12-02 15:23:21.992681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.308 [2024-12-02 15:23:21.992722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.308 [2024-12-02 15:23:21.992852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.308 [2024-12-02 15:23:21.992870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.308 [2024-12-02 15:23:21.992998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.308 [2024-12-02 15:23:21.993021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.308 #41 NEW cov: 11892 ft: 14250 corp: 22/1605b lim: 85 exec/s: 41 rss: 69Mb L: 82/85 MS: 1 ChangeBinInt- 00:07:36.308 [2024-12-02 15:23:22.042878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.308 [2024-12-02 15:23:22.042909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.308 [2024-12-02 15:23:22.042993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.308 [2024-12-02 15:23:22.043015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.308 [2024-12-02 15:23:22.043141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.308 [2024-12-02 15:23:22.043162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.309 [2024-12-02 15:23:22.043282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.309 [2024-12-02 15:23:22.043306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.309 #42 NEW cov: 11892 ft: 14264 corp: 23/1687b lim: 85 exec/s: 42 rss: 69Mb L: 82/85 MS: 1 CopyPart- 00:07:36.309 [2024-12-02 15:23:22.083047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.309 [2024-12-02 15:23:22.083079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.309 [2024-12-02 15:23:22.083166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.309 [2024-12-02 15:23:22.083189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.309 [2024-12-02 15:23:22.083309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.309 [2024-12-02 15:23:22.083331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.309 [2024-12-02 15:23:22.083455] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.309 [2024-12-02 15:23:22.083474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.309 #43 NEW cov: 11892 ft: 14265 corp: 24/1769b lim: 85 exec/s: 43 rss: 69Mb L: 82/85 MS: 1 ChangeBinInt- 00:07:36.309 [2024-12-02 15:23:22.123201] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.309 [2024-12-02 15:23:22.123231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.309 [2024-12-02 15:23:22.123363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.309 [2024-12-02 15:23:22.123385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.309 [2024-12-02 15:23:22.123504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.309 [2024-12-02 15:23:22.123528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.309 [2024-12-02 15:23:22.123647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.309 [2024-12-02 15:23:22.123668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.573 #44 NEW cov: 11892 ft: 14276 corp: 25/1852b lim: 85 exec/s: 44 rss: 69Mb L: 83/85 MS: 1 CrossOver- 00:07:36.573 [2024-12-02 15:23:22.163039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.573 [2024-12-02 15:23:22.163069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.573 [2024-12-02 15:23:22.163164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.573 [2024-12-02 15:23:22.163185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.573 [2024-12-02 15:23:22.163305] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.573 [2024-12-02 15:23:22.163328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.573 [2024-12-02 15:23:22.163443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.573 [2024-12-02 15:23:22.163465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.573 #45 NEW cov: 11892 ft: 14283 corp: 26/1934b lim: 85 exec/s: 45 rss: 69Mb L: 82/85 MS: 1 CMP- DE: "\377\003\000\000\000\000\000\000"- 00:07:36.573 [2024-12-02 15:23:22.213006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.573 [2024-12-02 15:23:22.213037] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.573 [2024-12-02 15:23:22.213139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.573 [2024-12-02 15:23:22.213163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.573 [2024-12-02 15:23:22.213283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.573 [2024-12-02 15:23:22.213304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.573 [2024-12-02 15:23:22.213429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.573 [2024-12-02 15:23:22.213450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.573 #46 NEW cov: 11892 ft: 14300 corp: 27/2005b lim: 85 exec/s: 46 rss: 69Mb L: 71/85 MS: 1 CMP- DE: "\376\377\377\377"- 00:07:36.573 [2024-12-02 15:23:22.263358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.573 [2024-12-02 15:23:22.263393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.573 [2024-12-02 15:23:22.263523] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.573 [2024-12-02 15:23:22.263545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.573 [2024-12-02 15:23:22.263673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.573 [2024-12-02 15:23:22.263691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.573 [2024-12-02 15:23:22.263819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.573 [2024-12-02 15:23:22.263843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.573 #47 NEW cov: 11892 ft: 14304 corp: 28/2088b lim: 85 exec/s: 47 rss: 69Mb L: 83/85 MS: 1 InsertByte- 00:07:36.573 [2024-12-02 15:23:22.313719] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.573 [2024-12-02 15:23:22.313756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.573 [2024-12-02 15:23:22.313876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.573 [2024-12-02 15:23:22.313898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.573 [2024-12-02 15:23:22.314014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.573 [2024-12-02 15:23:22.314039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 
cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.573 [2024-12-02 15:23:22.314167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.573 [2024-12-02 15:23:22.314188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.573 #48 NEW cov: 11892 ft: 14316 corp: 29/2170b lim: 85 exec/s: 48 rss: 69Mb L: 82/85 MS: 1 ChangeByte- 00:07:36.573 [2024-12-02 15:23:22.363737] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.573 [2024-12-02 15:23:22.363771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.573 [2024-12-02 15:23:22.363901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.573 [2024-12-02 15:23:22.363925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.573 [2024-12-02 15:23:22.364048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.573 [2024-12-02 15:23:22.364068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.573 [2024-12-02 15:23:22.364196] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.573 [2024-12-02 15:23:22.364214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.573 #49 NEW cov: 11892 ft: 14321 corp: 30/2252b lim: 85 exec/s: 49 rss: 70Mb L: 82/85 MS: 1 ChangeBinInt- 00:07:36.832 [2024-12-02 15:23:22.403679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.832 [2024-12-02 15:23:22.403712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.832 [2024-12-02 15:23:22.403865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.832 [2024-12-02 15:23:22.403889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.832 [2024-12-02 15:23:22.404010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.832 [2024-12-02 15:23:22.404034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.832 [2024-12-02 15:23:22.404165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.832 [2024-12-02 15:23:22.404183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.832 #50 NEW cov: 11892 ft: 14332 corp: 31/2334b lim: 85 exec/s: 50 rss: 70Mb L: 82/85 MS: 1 ChangeBit- 00:07:36.832 [2024-12-02 15:23:22.443700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.832 [2024-12-02 15:23:22.443730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.832 [2024-12-02 15:23:22.443834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.832 [2024-12-02 15:23:22.443857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.832 [2024-12-02 15:23:22.443980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.832 [2024-12-02 15:23:22.444001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.832 #51 NEW cov: 11892 ft: 14340 corp: 32/2396b lim: 85 exec/s: 51 rss: 70Mb L: 62/85 MS: 1 CopyPart- 00:07:36.832 [2024-12-02 15:23:22.484434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.832 [2024-12-02 15:23:22.484464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.832 [2024-12-02 15:23:22.484563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.832 [2024-12-02 15:23:22.484585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.832 [2024-12-02 15:23:22.484703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.832 [2024-12-02 15:23:22.484723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.832 [2024-12-02 15:23:22.484849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.832 [2024-12-02 15:23:22.484871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.832 [2024-12-02 15:23:22.485013] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:36.832 [2024-12-02 15:23:22.485037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:36.833 #52 NEW cov: 11892 ft: 14342 corp: 33/2481b lim: 85 exec/s: 52 rss: 70Mb L: 85/85 MS: 1 CopyPart- 00:07:36.833 [2024-12-02 15:23:22.524332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.833 [2024-12-02 15:23:22.524364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.833 [2024-12-02 15:23:22.524503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.833 [2024-12-02 15:23:22.524522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.833 [2024-12-02 15:23:22.524643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.833 [2024-12-02 15:23:22.524669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.833 [2024-12-02 15:23:22.524790] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.833 [2024-12-02 15:23:22.524811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.833 #53 NEW cov: 11892 ft: 14372 corp: 34/2563b lim: 85 exec/s: 53 rss: 70Mb L: 82/85 MS: 1 ChangeByte- 00:07:36.833 [2024-12-02 15:23:22.574398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.833 [2024-12-02 15:23:22.574430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.833 [2024-12-02 15:23:22.574519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.833 [2024-12-02 15:23:22.574540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.833 [2024-12-02 15:23:22.574650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.833 [2024-12-02 15:23:22.574667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.833 [2024-12-02 15:23:22.574795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.833 [2024-12-02 15:23:22.574814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.833 #54 NEW cov: 11892 ft: 14405 corp: 35/2642b lim: 85 exec/s: 54 rss: 70Mb L: 79/85 MS: 1 PersAutoDict- DE: "\377\003\000\000\000\000\000\000"- 00:07:36.833 [2024-12-02 15:23:22.614597] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:36.833 [2024-12-02 15:23:22.614624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.833 [2024-12-02 15:23:22.614695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:36.833 [2024-12-02 15:23:22.614716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.833 [2024-12-02 15:23:22.614829] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:36.833 [2024-12-02 15:23:22.614851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.833 [2024-12-02 15:23:22.614968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:36.833 [2024-12-02 15:23:22.614984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.833 #55 NEW cov: 11892 ft: 14422 corp: 36/2725b lim: 85 exec/s: 55 rss: 70Mb L: 83/85 MS: 1 InsertByte- 00:07:37.092 [2024-12-02 15:23:22.654778] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:37.092 [2024-12-02 15:23:22.654806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
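
The CMP and PersAutoDict mutations above (with DE: entries such as "\377\000", "\377\003\000\000\000\000\000\000", and "\376\377\377\377") come from libFuzzer's comparison-derived auto dictionary; the run's epilogue below summarizes the same entries under "Recommended dictionary" with their use counts. A small sketch, assuming a hypothetical output file name, of persisting those entries in the -dict= file format so a later run could be seeded with them:

    /* Sketch: write this run's recommended dictionary entries to a file
     * usable with libFuzzer's -dict= option. The three entries are the
     * ones reported in the log; "nvmf_22.dict" is a hypothetical name. */
    #include <stdio.h>

    int main(void)
    {
        FILE *f = fopen("nvmf_22.dict", "w");
        if (f == NULL)
            return 1;
        /* One double-quoted, backslash-escaped entry per line. */
        fputs("\"\\377\\000\"\n", f);
        fputs("\"\\377\\003\\000\\000\\000\\000\\000\\000\"\n", f);
        fputs("\"\\376\\377\\377\\377\"\n", f);
        fclose(f);
        return 0;
    }

Each file line uses the same quoted, escaped notation the log itself prints for the entries.
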
00:07:37.092 [2024-12-02 15:23:22.654882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:37.092 [2024-12-02 15:23:22.654901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.092 [2024-12-02 15:23:22.655024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:37.092 [2024-12-02 15:23:22.655047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.092 [2024-12-02 15:23:22.655157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:37.092 [2024-12-02 15:23:22.655179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:37.092 #56 NEW cov: 11892 ft: 14425 corp: 37/2801b lim: 85 exec/s: 56 rss: 70Mb L: 76/85 MS: 1 EraseBytes- 00:07:37.092 [2024-12-02 15:23:22.694097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:37.092 [2024-12-02 15:23:22.694121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.092 #58 NEW cov: 11892 ft: 15228 corp: 38/2829b lim: 85 exec/s: 58 rss: 70Mb L: 28/85 MS: 2 ChangeBit-CrossOver- 00:07:37.092 [2024-12-02 15:23:22.734921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:37.092 [2024-12-02 15:23:22.734953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.092 [2024-12-02 15:23:22.735047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:37.093 [2024-12-02 15:23:22.735068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.093 [2024-12-02 15:23:22.735180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:37.093 [2024-12-02 15:23:22.735202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.093 [2024-12-02 15:23:22.735323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:37.093 [2024-12-02 15:23:22.735345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:37.093 #59 NEW cov: 11892 ft: 15261 corp: 39/2911b lim: 85 exec/s: 59 rss: 70Mb L: 82/85 MS: 1 ChangeByte- 00:07:37.093 [2024-12-02 15:23:22.775280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:37.093 [2024-12-02 15:23:22.775311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.093 [2024-12-02 15:23:22.775408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:37.093 [2024-12-02 15:23:22.775444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:07:37.093 [2024-12-02 15:23:22.775558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:37.093 [2024-12-02 15:23:22.775581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.093 [2024-12-02 15:23:22.775692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:37.093 [2024-12-02 15:23:22.775713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:37.093 [2024-12-02 15:23:22.775840] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:07:37.093 [2024-12-02 15:23:22.775863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:37.093 #60 NEW cov: 11892 ft: 15276 corp: 40/2996b lim: 85 exec/s: 60 rss: 70Mb L: 85/85 MS: 1 CopyPart- 00:07:37.093 [2024-12-02 15:23:22.815193] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:37.093 [2024-12-02 15:23:22.815221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.093 [2024-12-02 15:23:22.815319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:37.093 [2024-12-02 15:23:22.815336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.093 [2024-12-02 15:23:22.815456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:37.093 [2024-12-02 15:23:22.815477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.093 [2024-12-02 15:23:22.815594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:37.093 [2024-12-02 15:23:22.815614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:37.093 #61 NEW cov: 11892 ft: 15342 corp: 41/3073b lim: 85 exec/s: 61 rss: 70Mb L: 77/85 MS: 1 InsertRepeatedBytes- 00:07:37.093 [2024-12-02 15:23:22.854811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:37.093 [2024-12-02 15:23:22.854841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.093 [2024-12-02 15:23:22.854942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:37.093 [2024-12-02 15:23:22.854959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.093 #62 NEW cov: 11892 ft: 15347 corp: 42/3121b lim: 85 exec/s: 31 rss: 70Mb L: 48/85 MS: 1 ChangeBit- 00:07:37.093 #62 DONE cov: 11892 ft: 15347 corp: 42/3121b lim: 85 exec/s: 31 rss: 70Mb 00:07:37.093 ###### Recommended dictionary. 
###### 00:07:37.093 "\377\000" # Uses: 4 00:07:37.093 "\377\003\000\000\000\000\000\000" # Uses: 1 00:07:37.093 "\376\377\377\377" # Uses: 0 00:07:37.093 ###### End of recommended dictionary. ###### 00:07:37.093 Done 62 runs in 2 second(s) 00:07:37.353 15:23:22 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:07:37.353 15:23:22 -- ../common.sh@72 -- # (( i++ )) 00:07:37.353 15:23:22 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:37.353 15:23:22 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:07:37.353 15:23:22 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:07:37.353 15:23:22 -- nvmf/run.sh@24 -- # local timen=1 00:07:37.353 15:23:22 -- nvmf/run.sh@25 -- # local core=0x1 00:07:37.353 15:23:22 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:37.353 15:23:22 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:07:37.353 15:23:22 -- nvmf/run.sh@29 -- # printf %02d 23 00:07:37.353 15:23:22 -- nvmf/run.sh@29 -- # port=4423 00:07:37.353 15:23:22 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:37.353 15:23:22 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:07:37.353 15:23:22 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:37.353 15:23:23 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:07:37.353 [2024-12-02 15:23:23.029705] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:37.353 [2024-12-02 15:23:23.029779] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1605823 ] 00:07:37.353 EAL: No free 2048 kB hugepages reported on node 1 00:07:37.612 [2024-12-02 15:23:23.206823] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.612 [2024-12-02 15:23:23.269116] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:37.612 [2024-12-02 15:23:23.269236] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.612 [2024-12-02 15:23:23.327006] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:37.612 [2024-12-02 15:23:23.343276] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:07:37.612 INFO: Running with entropic power schedule (0xFF, 100). 
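
Run 22's config is removed above and start_llvm_fuzz 23 brings up the next target instance. The traced printf %02d 23 followed by port=4423 (and 4424 later for fuzzer 24) suggests the listener port is assembled from a fixed 44 prefix plus the zero-padded fuzzer number; that derivation is an inference from the trace, checked here in one line:

    /* Assumed port scheme from the run.sh trace: "44" plus the
     * zero-padded fuzzer number, so fuzzer 23 listens on 4423. */
    #include <stdio.h>

    int main(void)
    {
        int fuzzer = 23;                  /* from start_llvm_fuzz 23 */
        printf("port=44%02d\n", fuzzer);  /* prints port=4423 */
        return 0;
    }
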
00:07:37.612 INFO: Seed: 568207876 00:07:37.612 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:37.612 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:37.612 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:37.612 INFO: A corpus is not provided, starting from an empty corpus 00:07:37.612 #2 INITED exec/s: 0 rss: 61Mb 00:07:37.612 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:37.612 This may also happen if the target rejected all inputs we tried so far 00:07:37.612 [2024-12-02 15:23:23.412664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:37.612 [2024-12-02 15:23:23.412697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.130 NEW_FUNC[1/671]: 0x465118 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:07:38.130 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:38.130 #9 NEW cov: 11598 ft: 11580 corp: 2/6b lim: 25 exec/s: 0 rss: 68Mb L: 5/5 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:38.130 [2024-12-02 15:23:23.743684] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.130 [2024-12-02 15:23:23.743743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.130 #10 NEW cov: 11711 ft: 12164 corp: 3/11b lim: 25 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 ChangeByte- 00:07:38.130 [2024-12-02 15:23:23.803748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.130 [2024-12-02 15:23:23.803784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.130 #11 NEW cov: 11717 ft: 12410 corp: 4/17b lim: 25 exec/s: 0 rss: 68Mb L: 6/6 MS: 1 InsertByte- 00:07:38.130 [2024-12-02 15:23:23.853887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.130 [2024-12-02 15:23:23.853913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.130 #12 NEW cov: 11802 ft: 12662 corp: 5/22b lim: 25 exec/s: 0 rss: 68Mb L: 5/6 MS: 1 ChangeBit- 00:07:38.130 [2024-12-02 15:23:23.904023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.130 [2024-12-02 15:23:23.904048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.131 #18 NEW cov: 11802 ft: 12758 corp: 6/27b lim: 25 exec/s: 0 rss: 68Mb L: 5/6 MS: 1 ChangeBinInt- 00:07:38.390 [2024-12-02 15:23:23.954221] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.390 [2024-12-02 15:23:23.954255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.390 #19 NEW cov: 11802 ft: 12869 corp: 7/32b lim: 25 exec/s: 0 rss: 68Mb L: 5/6 MS: 1 ChangeBinInt- 
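
The NEW_FUNC lines above mark functions reached for the first time by the evolving corpus, here the harness entry point TestOneInput and the per-opcode handler fuzz_nvm_reservation_report_command, both in llvm_nvme_fuzz.c. A bare-bones sketch of that shape, assuming the usual libFuzzer callback convention; the bodies are illustrative stand-ins, not SPDK's actual implementation:

    /* Harness-shape sketch suggested by the NEW_FUNC lines; the names
     * match the log, the bodies are placeholders. */
    #include <stddef.h>
    #include <stdint.h>

    void fuzz_nvm_reservation_report_command(const uint8_t *data, size_t size)
    {
        (void)data;
        (void)size;
        /* The real handler builds a RESERVATION REPORT command from the
         * input bytes and submits it to the NVMe/TCP target. */
    }

    /* Callback handed to libFuzzer's driver; non-crashing inputs
     * always return 0. */
    int TestOneInput(const uint8_t *data, size_t size)
    {
        fuzz_nvm_reservation_report_command(data, size);
        return 0;
    }
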
00:07:38.390 [2024-12-02 15:23:24.004406] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.390 [2024-12-02 15:23:24.004435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.390 #20 NEW cov: 11802 ft: 12918 corp: 8/37b lim: 25 exec/s: 0 rss: 68Mb L: 5/6 MS: 1 ChangeBinInt- 00:07:38.390 [2024-12-02 15:23:24.054479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.390 [2024-12-02 15:23:24.054507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.390 #26 NEW cov: 11802 ft: 12947 corp: 9/43b lim: 25 exec/s: 0 rss: 69Mb L: 6/6 MS: 1 CrossOver- 00:07:38.390 [2024-12-02 15:23:24.104690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.390 [2024-12-02 15:23:24.104717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.390 #27 NEW cov: 11802 ft: 12961 corp: 10/48b lim: 25 exec/s: 0 rss: 69Mb L: 5/6 MS: 1 ChangeBinInt- 00:07:38.390 [2024-12-02 15:23:24.154827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.390 [2024-12-02 15:23:24.154859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.390 #28 NEW cov: 11802 ft: 13076 corp: 11/53b lim: 25 exec/s: 0 rss: 69Mb L: 5/6 MS: 1 ChangeBit- 00:07:38.390 [2024-12-02 15:23:24.205108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.390 [2024-12-02 15:23:24.205133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.649 #29 NEW cov: 11802 ft: 13139 corp: 12/59b lim: 25 exec/s: 0 rss: 69Mb L: 6/6 MS: 1 CopyPart- 00:07:38.649 [2024-12-02 15:23:24.255187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.649 [2024-12-02 15:23:24.255222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.649 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:38.649 #30 NEW cov: 11825 ft: 13184 corp: 13/64b lim: 25 exec/s: 0 rss: 69Mb L: 5/6 MS: 1 ChangeBit- 00:07:38.649 [2024-12-02 15:23:24.305323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.649 [2024-12-02 15:23:24.305348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.649 #31 NEW cov: 11825 ft: 13206 corp: 14/69b lim: 25 exec/s: 0 rss: 69Mb L: 5/6 MS: 1 ChangeBinInt- 00:07:38.649 [2024-12-02 15:23:24.355532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.649 [2024-12-02 15:23:24.355562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.649 #32 NEW cov: 11825 ft: 13267 corp: 15/74b lim: 25 exec/s: 0 rss: 69Mb L: 5/6 MS: 1 ChangeBinInt- 
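
Where fuzzer 22 drove RESERVATION REGISTER, this run exercises RESERVATION REPORT; the parenthesized values in the command prints are the NVMe I/O opcode encodings, restated here as a local enum (the constants are ours, chosen to match the log):

    /* The two NVMe I/O opcodes exercised by these runs, matching the
     * "(0d)" and "(0e)" printed in the command lines. */
    #include <stdio.h>

    enum nvme_io_opcode {
        NVME_OPC_RESERVATION_REGISTER = 0x0d,  /* fuzzer 22's target */
        NVME_OPC_RESERVATION_REPORT   = 0x0e,  /* fuzzer 23's target */
    };

    int main(void)
    {
        printf("register=%02x report=%02x\n",
               NVME_OPC_RESERVATION_REGISTER, NVME_OPC_RESERVATION_REPORT);
        return 0;
    }
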
00:07:38.649 [2024-12-02 15:23:24.405660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.649 [2024-12-02 15:23:24.405691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.649 #33 NEW cov: 11825 ft: 13280 corp: 16/80b lim: 25 exec/s: 33 rss: 69Mb L: 6/6 MS: 1 ChangeBinInt- 00:07:38.649 [2024-12-02 15:23:24.455864] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.649 [2024-12-02 15:23:24.455895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.909 #34 NEW cov: 11825 ft: 13310 corp: 17/85b lim: 25 exec/s: 34 rss: 69Mb L: 5/6 MS: 1 ChangeBinInt- 00:07:38.909 [2024-12-02 15:23:24.496043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.909 [2024-12-02 15:23:24.496074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.909 #35 NEW cov: 11825 ft: 13317 corp: 18/94b lim: 25 exec/s: 35 rss: 69Mb L: 9/9 MS: 1 CopyPart- 00:07:38.909 [2024-12-02 15:23:24.546149] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.909 [2024-12-02 15:23:24.546178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.909 #36 NEW cov: 11825 ft: 13338 corp: 19/101b lim: 25 exec/s: 36 rss: 69Mb L: 7/9 MS: 1 InsertByte- 00:07:38.909 [2024-12-02 15:23:24.596456] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.909 [2024-12-02 15:23:24.596480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.909 #37 NEW cov: 11825 ft: 13347 corp: 20/106b lim: 25 exec/s: 37 rss: 70Mb L: 5/9 MS: 1 ChangeByte- 00:07:38.909 [2024-12-02 15:23:24.646551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.909 [2024-12-02 15:23:24.646580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.909 #39 NEW cov: 11825 ft: 13382 corp: 21/115b lim: 25 exec/s: 39 rss: 70Mb L: 9/9 MS: 2 EraseBytes-CrossOver- 00:07:38.909 [2024-12-02 15:23:24.696628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:38.909 [2024-12-02 15:23:24.696653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.909 #40 NEW cov: 11825 ft: 13391 corp: 22/121b lim: 25 exec/s: 40 rss: 70Mb L: 6/9 MS: 1 ChangeBinInt- 00:07:39.168 [2024-12-02 15:23:24.746853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.168 [2024-12-02 15:23:24.746879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.168 #41 NEW cov: 11825 ft: 13398 corp: 23/126b lim: 25 exec/s: 41 rss: 70Mb L: 5/9 MS: 1 ShuffleBytes- 00:07:39.168 [2024-12-02 15:23:24.796901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.168 [2024-12-02 15:23:24.796925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.168 #42 NEW cov: 11825 ft: 13414 corp: 24/131b lim: 25 exec/s: 42 rss: 70Mb L: 5/9 MS: 1 ChangeByte- 00:07:39.168 [2024-12-02 15:23:24.847124] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.168 [2024-12-02 15:23:24.847158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.168 #43 NEW cov: 11825 ft: 13429 corp: 25/136b lim: 25 exec/s: 43 rss: 70Mb L: 5/9 MS: 1 ShuffleBytes- 00:07:39.168 [2024-12-02 15:23:24.907957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.168 [2024-12-02 15:23:24.907990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.168 [2024-12-02 15:23:24.908122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:39.168 [2024-12-02 15:23:24.908147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:39.168 [2024-12-02 15:23:24.908275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:39.168 [2024-12-02 15:23:24.908296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:39.168 [2024-12-02 15:23:24.908420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:39.168 [2024-12-02 15:23:24.908446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:39.168 #44 NEW cov: 11825 ft: 14076 corp: 26/158b lim: 25 exec/s: 44 rss: 70Mb L: 22/22 MS: 1 InsertRepeatedBytes- 00:07:39.168 [2024-12-02 15:23:24.967531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.168 [2024-12-02 15:23:24.967569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.427 #45 NEW cov: 11825 ft: 14098 corp: 27/163b lim: 25 exec/s: 45 rss: 70Mb L: 5/22 MS: 1 ShuffleBytes- 00:07:39.427 [2024-12-02 15:23:25.017752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.427 [2024-12-02 15:23:25.017777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.427 #46 NEW cov: 11825 ft: 14105 corp: 28/168b lim: 25 exec/s: 46 rss: 70Mb L: 5/22 MS: 1 ChangeBinInt- 00:07:39.427 [2024-12-02 15:23:25.067851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.427 [2024-12-02 15:23:25.067876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.427 #47 NEW cov: 11825 ft: 14111 corp: 29/177b lim: 25 exec/s: 47 rss: 70Mb L: 9/22 MS: 1 CopyPart- 00:07:39.427 [2024-12-02 15:23:25.118370] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.427 [2024-12-02 15:23:25.118403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.427 [2024-12-02 15:23:25.118535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:39.427 [2024-12-02 15:23:25.118563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:39.427 [2024-12-02 15:23:25.118693] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:39.427 [2024-12-02 15:23:25.118717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:39.427 #48 NEW cov: 11825 ft: 14372 corp: 30/195b lim: 25 exec/s: 48 rss: 70Mb L: 18/22 MS: 1 InsertRepeatedBytes- 00:07:39.427 [2024-12-02 15:23:25.168306] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.427 [2024-12-02 15:23:25.168331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.427 #49 NEW cov: 11825 ft: 14388 corp: 31/204b lim: 25 exec/s: 49 rss: 70Mb L: 9/22 MS: 1 CrossOver- 00:07:39.427 [2024-12-02 15:23:25.218318] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.427 [2024-12-02 15:23:25.218346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.427 #50 NEW cov: 11825 ft: 14396 corp: 32/209b lim: 25 exec/s: 50 rss: 70Mb L: 5/22 MS: 1 ShuffleBytes- 00:07:39.685 [2024-12-02 15:23:25.268579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.685 [2024-12-02 15:23:25.268606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.685 #52 NEW cov: 11825 ft: 14428 corp: 33/216b lim: 25 exec/s: 52 rss: 70Mb L: 7/22 MS: 2 EraseBytes-CrossOver- 00:07:39.685 [2024-12-02 15:23:25.318667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.685 [2024-12-02 15:23:25.318697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.685 #53 NEW cov: 11825 ft: 14446 corp: 34/223b lim: 25 exec/s: 53 rss: 70Mb L: 7/22 MS: 1 ChangeBinInt- 00:07:39.685 [2024-12-02 15:23:25.368839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.685 [2024-12-02 15:23:25.368869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.685 [2024-12-02 15:23:25.408999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:39.685 [2024-12-02 15:23:25.409026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.685 #56 NEW cov: 11825 ft: 14454 corp: 35/232b lim: 25 exec/s: 28 rss: 70Mb L: 9/22 MS: 3 ChangeBit-EraseBytes-CrossOver- 00:07:39.685 #56 DONE cov: 11825 ft: 14454 corp: 
35/232b lim: 25 exec/s: 28 rss: 70Mb
00:07:39.685 Done 56 runs in 2 second(s)
00:07:39.943 15:23:25 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf
00:07:39.943 15:23:25 -- ../common.sh@72 -- # (( i++ ))
00:07:39.943 15:23:25 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:39.943 15:23:25 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1
00:07:39.943 15:23:25 -- nvmf/run.sh@23 -- # local fuzzer_type=24
00:07:39.943 15:23:25 -- nvmf/run.sh@24 -- # local timen=1
00:07:39.943 15:23:25 -- nvmf/run.sh@25 -- # local core=0x1
00:07:39.943 15:23:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:07:39.943 15:23:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf
00:07:39.943 15:23:25 -- nvmf/run.sh@29 -- # printf %02d 24
00:07:39.943 15:23:25 -- nvmf/run.sh@29 -- # port=4424
00:07:39.943 15:23:25 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:07:39.943 15:23:25 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424'
00:07:39.943 15:23:25 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:39.943 15:23:25 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock
00:07:39.944 [2024-12-02 15:23:25.581499] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:39.944 [2024-12-02 15:23:25.581570] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1606364 ]
00:07:40.202 EAL: No free 2048 kB hugepages reported on node 1
00:07:40.202 [2024-12-02 15:23:25.757138] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:40.202 [2024-12-02 15:23:25.820592] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:40.202 [2024-12-02 15:23:25.820712] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:40.203 [2024-12-02 15:23:25.878295] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:40.203 [2024-12-02 15:23:25.894583] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 ***
00:07:40.203 INFO: Running with entropic power schedule (0xFF, 100).
00:07:40.203 INFO: Seed: 3121221534
00:07:40.203 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:40.203 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:40.203 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24
00:07:40.203 INFO: A corpus is not provided, starting from an empty corpus
00:07:40.203 #2 INITED exec/s: 0 rss: 61Mb
00:07:40.203 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:40.203 This may also happen if the target rejected all inputs we tried so far
00:07:40.203 [2024-12-02 15:23:25.950411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:40.203 [2024-12-02 15:23:25.950442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:40.203 [2024-12-02 15:23:25.950486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:40.203 [2024-12-02 15:23:25.950502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:40.203 [2024-12-02 15:23:25.950564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:40.203 [2024-12-02 15:23:25.950579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
00:07:40.203 [2024-12-02 15:23:25.950635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:40.203 [2024-12-02 15:23:25.950651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1
00:07:40.203 [2024-12-02 15:23:25.950710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:40.203 [2024-12-02 15:23:25.950726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1
00:07:40.462 NEW_FUNC[1/672]: 0x466208 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685
00:07:40.462 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:40.462 #9 NEW cov: 11668 ft: 11669 corp: 2/101b lim: 100 exec/s: 0 rss: 68Mb L: 100/100 MS: 2 CMP-InsertRepeatedBytes- DE: "\000\005"-
00:07:40.462 [2024-12-02 15:23:26.271174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882294971040129 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:40.462 [2024-12-02 15:23:26.271207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1
00:07:40.462 [2024-12-02 15:23:26.271245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:40.462 [2024-12-02 15:23:26.271261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1
00:07:40.462 [2024-12-02 15:23:26.271316] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:40.462 [2024-12-02 15:23:26.271333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1
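Each print_command/print_completion pair above is SPDK's qpair trace for one fuzzed NVMe COMPARE and the completion the TCP target returned for it: the "(00/0b)" in every completion is status code type 0x00 (generic command status) over status code 0x0b (Invalid Namespace or Format), and dnr:1 is the Do Not Retry bit, i.e. the target is cleanly rejecting each command rather than misbehaving. A minimal standalone sketch of where those printed fields live, assuming only the NVMe base-spec completion layout (illustrative C, not SPDK's own structs or print routine):

#include <stdint.h>
#include <stdio.h>

/* Unpack the status fields printed above, e.g. "(00/0b) ... p:0 m:0 dnr:1".
 * NVMe completion queue entry, dword 3: CID [15:0], phase tag P [16],
 * status code SC [24:17], status code type SCT [27:25], more M [30],
 * do-not-retry DNR [31]. */
static void print_completion_status(uint32_t cqe_dw3)
{
    unsigned cid = cqe_dw3 & 0xffffu;
    unsigned p   = (cqe_dw3 >> 16) & 0x1u;
    unsigned sc  = (cqe_dw3 >> 17) & 0xffu; /* 0x0b = Invalid Namespace or Format */
    unsigned sct = (cqe_dw3 >> 25) & 0x7u;  /* 0x0  = generic command status */
    unsigned m   = (cqe_dw3 >> 30) & 0x1u;
    unsigned dnr = (cqe_dw3 >> 31) & 0x1u;
    printf("(%02x/%02x) cid:%u p:%u m:%u dnr:%u\n", sct, sc, cid, p, m, dnr);
}

int main(void)
{
    /* DNR=1, SCT=0x00, SC=0x0b, CID=0 -- matches the completions logged above,
     * printing "(00/0b) cid:0 p:0 m:0 dnr:1" */
    print_completion_status((1u << 31) | (0x0bu << 17));
    return 0;
}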
00:07:40.462 [2024-12-02 15:23:26.271386] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.462 [2024-12-02 15:23:26.271401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.462 [2024-12-02 15:23:26.271433] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.462 [2024-12-02 15:23:26.271448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:40.722 #10 NEW cov: 11783 ft: 12075 corp: 3/201b lim: 100 exec/s: 0 rss: 68Mb L: 100/100 MS: 1 ChangeByte- 00:07:40.722 [2024-12-02 15:23:26.321252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.321281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.722 [2024-12-02 15:23:26.321330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.321346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.722 [2024-12-02 15:23:26.321403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.321419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.722 [2024-12-02 15:23:26.321475] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.321489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.722 [2024-12-02 15:23:26.321544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.321560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:40.722 #16 NEW cov: 11789 ft: 12476 corp: 4/301b lim: 100 exec/s: 0 rss: 68Mb L: 100/100 MS: 1 CopyPart- 00:07:40.722 [2024-12-02 15:23:26.361371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.361400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.722 [2024-12-02 15:23:26.361439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.361455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.722 [2024-12-02 15:23:26.361510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.361526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.722 [2024-12-02 15:23:26.361581] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.361596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.722 [2024-12-02 15:23:26.361649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.361665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:40.722 #17 NEW cov: 11874 ft: 12728 corp: 5/401b lim: 100 exec/s: 0 rss: 68Mb L: 100/100 MS: 1 PersAutoDict- DE: "\000\005"- 00:07:40.722 [2024-12-02 15:23:26.401502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882294971040129 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.401531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.722 [2024-12-02 15:23:26.401582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.401599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.722 [2024-12-02 15:23:26.401655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.401671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.722 [2024-12-02 15:23:26.401727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.401751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.722 [2024-12-02 15:23:26.401809] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.401824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:40.722 #18 NEW cov: 11874 ft: 12886 corp: 6/501b lim: 100 exec/s: 0 rss: 68Mb L: 100/100 MS: 1 ShuffleBytes- 00:07:40.722 [2024-12-02 15:23:26.441576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.441604] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.722 [2024-12-02 15:23:26.441652] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.441668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.722 [2024-12-02 15:23:26.441723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.441744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.722 [2024-12-02 15:23:26.441798] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.722 [2024-12-02 15:23:26.441814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.723 [2024-12-02 15:23:26.441868] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.723 [2024-12-02 15:23:26.441886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:40.723 #19 NEW cov: 11874 ft: 13009 corp: 7/601b lim: 100 exec/s: 0 rss: 68Mb L: 100/100 MS: 1 ShuffleBytes- 00:07:40.723 [2024-12-02 15:23:26.481692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:556223529316 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.723 [2024-12-02 15:23:26.481720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.723 [2024-12-02 15:23:26.481782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.723 [2024-12-02 15:23:26.481797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.723 [2024-12-02 15:23:26.481853] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.723 [2024-12-02 15:23:26.481870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.723 [2024-12-02 15:23:26.481925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.723 [2024-12-02 15:23:26.481943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.723 [2024-12-02 15:23:26.481999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.723 [2024-12-02 15:23:26.482018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:40.723 #20 NEW cov: 11874 ft: 13136 corp: 8/701b lim: 100 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 ChangeBinInt- 00:07:40.723 [2024-12-02 15:23:26.521777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890883 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.723 [2024-12-02 15:23:26.521806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.723 [2024-12-02 15:23:26.521863] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.723 [2024-12-02 15:23:26.521879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.723 [2024-12-02 15:23:26.521934] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.723 [2024-12-02 15:23:26.521948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.723 [2024-12-02 15:23:26.522002] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.723 [2024-12-02 15:23:26.522018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.723 [2024-12-02 15:23:26.522073] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.723 [2024-12-02 15:23:26.522089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:40.723 #21 NEW cov: 11874 ft: 13286 corp: 9/801b lim: 100 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 ChangeByte- 00:07:40.983 [2024-12-02 15:23:26.561922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331881883795030467 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.561949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.983 [2024-12-02 15:23:26.562000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.562016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.983 [2024-12-02 15:23:26.562071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.562086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.983 [2024-12-02 15:23:26.562140] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.562156] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.983 [2024-12-02 15:23:26.562214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.562229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:40.983 #22 NEW cov: 11874 ft: 13415 corp: 10/901b lim: 100 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 ChangeByte- 00:07:40.983 [2024-12-02 15:23:26.602052] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.602084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.983 [2024-12-02 15:23:26.602122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.602139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.983 [2024-12-02 15:23:26.602194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.602209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.983 [2024-12-02 15:23:26.602267] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.602282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.983 [2024-12-02 15:23:26.602336] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.602352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:40.983 #23 NEW cov: 11874 ft: 13428 corp: 11/1001b lim: 100 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 ShuffleBytes- 00:07:40.983 [2024-12-02 15:23:26.642166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.642195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.983 [2024-12-02 15:23:26.642245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.642260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.983 [2024-12-02 15:23:26.642315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:40.983 [2024-12-02 15:23:26.642331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.983 [2024-12-02 15:23:26.642386] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.642401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.983 [2024-12-02 15:23:26.642457] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.642473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:40.983 #24 NEW cov: 11874 ft: 13505 corp: 12/1101b lim: 100 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 CopyPart- 00:07:40.983 [2024-12-02 15:23:26.682321] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.682349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.983 [2024-12-02 15:23:26.682399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.682417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.983 [2024-12-02 15:23:26.682472] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:142393223479808 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.682487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.983 [2024-12-02 15:23:26.682544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.682558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.983 [2024-12-02 15:23:26.682613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.682630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:40.983 #25 NEW cov: 11874 ft: 13535 corp: 13/1201b lim: 100 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 CMP- DE: "\002\000\000\000"- 00:07:40.983 [2024-12-02 15:23:26.722459] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.983 [2024-12-02 15:23:26.722487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.984 [2024-12-02 15:23:26.722536] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 
lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.984 [2024-12-02 15:23:26.722553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.984 [2024-12-02 15:23:26.722607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.984 [2024-12-02 15:23:26.722622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.984 [2024-12-02 15:23:26.722676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.984 [2024-12-02 15:23:26.722692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.984 [2024-12-02 15:23:26.722746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.984 [2024-12-02 15:23:26.722764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:40.984 #26 NEW cov: 11874 ft: 13555 corp: 14/1301b lim: 100 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 CopyPart- 00:07:40.984 [2024-12-02 15:23:26.762602] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882294971040129 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.984 [2024-12-02 15:23:26.762631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:40.984 [2024-12-02 15:23:26.762675] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.984 [2024-12-02 15:23:26.762692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:40.984 [2024-12-02 15:23:26.762747] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.984 [2024-12-02 15:23:26.762767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:40.984 [2024-12-02 15:23:26.762824] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.984 [2024-12-02 15:23:26.762840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:40.984 [2024-12-02 15:23:26.762899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:40.984 [2024-12-02 15:23:26.762914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:40.984 #27 NEW cov: 11874 ft: 13569 corp: 15/1401b lim: 100 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 ChangeBit- 00:07:41.245 [2024-12-02 15:23:26.802530] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331881883795030467 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.802559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.245 [2024-12-02 15:23:26.802603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.802619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.245 [2024-12-02 15:23:26.802661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.802676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.245 [2024-12-02 15:23:26.802739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.802756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.245 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:41.245 #28 NEW cov: 11897 ft: 13651 corp: 16/1493b lim: 100 exec/s: 0 rss: 69Mb L: 92/100 MS: 1 EraseBytes- 00:07:41.245 [2024-12-02 15:23:26.852859] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.852888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.245 [2024-12-02 15:23:26.852939] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.852955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.245 [2024-12-02 15:23:26.853010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.853027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.245 [2024-12-02 15:23:26.853084] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.853098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.245 [2024-12-02 15:23:26.853154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.853174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 
dnr:1 00:07:41.245 #29 NEW cov: 11897 ft: 13661 corp: 17/1593b lim: 100 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 ChangeByte- 00:07:41.245 [2024-12-02 15:23:26.892978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890883 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.893007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.245 [2024-12-02 15:23:26.893052] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.893068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.245 [2024-12-02 15:23:26.893124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:32383 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.893139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.245 [2024-12-02 15:23:26.893194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.893209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.245 [2024-12-02 15:23:26.893265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.893282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:41.245 #30 NEW cov: 11897 ft: 13719 corp: 18/1693b lim: 100 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 ChangeBinInt- 00:07:41.245 [2024-12-02 15:23:26.933073] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882294971040129 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.933100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.245 [2024-12-02 15:23:26.933154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.933170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.245 [2024-12-02 15:23:26.933227] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.933243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.245 [2024-12-02 15:23:26.933298] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.933313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.245 [2024-12-02 15:23:26.933369] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331776742995624321 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.933385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:41.245 #31 NEW cov: 11897 ft: 13732 corp: 19/1793b lim: 100 exec/s: 31 rss: 69Mb L: 100/100 MS: 1 ChangeByte- 00:07:41.245 [2024-12-02 15:23:26.973184] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.973212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.245 [2024-12-02 15:23:26.973263] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.245 [2024-12-02 15:23:26.973279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.246 [2024-12-02 15:23:26.973336] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.246 [2024-12-02 15:23:26.973352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.246 [2024-12-02 15:23:26.973408] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.246 [2024-12-02 15:23:26.973424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.246 [2024-12-02 15:23:26.973479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.246 [2024-12-02 15:23:26.973495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:41.246 #32 NEW cov: 11897 ft: 13742 corp: 20/1893b lim: 100 exec/s: 32 rss: 69Mb L: 100/100 MS: 1 ChangeBinInt- 00:07:41.246 [2024-12-02 15:23:27.013130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.246 [2024-12-02 15:23:27.013157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.246 [2024-12-02 15:23:27.013207] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.246 [2024-12-02 15:23:27.013223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.246 [2024-12-02 15:23:27.013277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.246 [2024-12-02 15:23:27.013293] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.246 [2024-12-02 15:23:27.013350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.246 [2024-12-02 15:23:27.013366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.246 #33 NEW cov: 11897 ft: 13768 corp: 21/1974b lim: 100 exec/s: 33 rss: 69Mb L: 81/100 MS: 1 EraseBytes- 00:07:41.246 [2024-12-02 15:23:27.053408] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.246 [2024-12-02 15:23:27.053436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.246 [2024-12-02 15:23:27.053488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.246 [2024-12-02 15:23:27.053504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.246 [2024-12-02 15:23:27.053560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:142393223479808 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.246 [2024-12-02 15:23:27.053579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.246 [2024-12-02 15:23:27.053636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.246 [2024-12-02 15:23:27.053652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.246 [2024-12-02 15:23:27.053708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.246 [2024-12-02 15:23:27.053725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:41.505 #34 NEW cov: 11897 ft: 13778 corp: 22/2074b lim: 100 exec/s: 34 rss: 69Mb L: 100/100 MS: 1 CrossOver- 00:07:41.505 [2024-12-02 15:23:27.093397] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.505 [2024-12-02 15:23:27.093425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.505 [2024-12-02 15:23:27.093474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331942769251418497 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.505 [2024-12-02 15:23:27.093490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.093544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 
[2024-12-02 15:23:27.093561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.093617] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.093633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.506 #35 NEW cov: 11897 ft: 13810 corp: 23/2155b lim: 100 exec/s: 35 rss: 70Mb L: 81/100 MS: 1 ChangeByte- 00:07:41.506 [2024-12-02 15:23:27.133642] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:142393223512420 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.133671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.133720] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.133738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.133796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.133812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.133866] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.133882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.133938] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.133957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:41.506 #36 NEW cov: 11897 ft: 13826 corp: 24/2255b lim: 100 exec/s: 36 rss: 70Mb L: 100/100 MS: 1 CrossOver- 00:07:41.506 [2024-12-02 15:23:27.173826] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882294971040129 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.173854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.173905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.173920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.173977] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.173994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.174046] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.174061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.174119] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.174135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:41.506 #37 NEW cov: 11897 ft: 13887 corp: 25/2355b lim: 100 exec/s: 37 rss: 70Mb L: 100/100 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:07:41.506 [2024-12-02 15:23:27.213739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331881883795030467 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.213766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.213818] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.213834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.213889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.213905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.213960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.213974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.506 #38 NEW cov: 11897 ft: 13920 corp: 26/2451b lim: 100 exec/s: 38 rss: 70Mb L: 96/100 MS: 1 CopyPart- 00:07:41.506 [2024-12-02 15:23:27.254019] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.254047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.254098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.254117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.254172] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:396740635408826624 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.254188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.254243] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.254258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.254315] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.254330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:41.506 #39 NEW cov: 11897 ft: 13940 corp: 27/2551b lim: 100 exec/s: 39 rss: 70Mb L: 100/100 MS: 1 CopyPart- 00:07:41.506 [2024-12-02 15:23:27.294160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882294971040129 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.294189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.294241] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.294257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.294311] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.294328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.294384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.294400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.506 [2024-12-02 15:23:27.294455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.506 [2024-12-02 15:23:27.294470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:41.506 #40 NEW cov: 11897 ft: 13948 corp: 28/2651b lim: 100 exec/s: 40 rss: 70Mb L: 100/100 MS: 1 ShuffleBytes- 00:07:41.767 [2024-12-02 15:23:27.334255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.334283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 
15:23:27.334333] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:142393223479808 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.334348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.334403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.334421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.334474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.334488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.334544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.334561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:41.767 #41 NEW cov: 11897 ft: 13984 corp: 29/2751b lim: 100 exec/s: 41 rss: 70Mb L: 100/100 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:07:41.767 [2024-12-02 15:23:27.374385] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.374413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.374465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882293964407169 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.374482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.374537] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:142393223479808 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.374553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.374606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.374622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.374679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.374696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:41.767 #42 NEW cov: 11897 ft: 13997 corp: 30/2851b lim: 100 
exec/s: 42 rss: 70Mb L: 100/100 MS: 1 ChangeBit- 00:07:41.767 [2024-12-02 15:23:27.414493] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.414522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.414574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.414590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.414645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.414662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.414715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.414738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.414796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33233 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.414812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:41.767 #43 NEW cov: 11897 ft: 14004 corp: 31/2951b lim: 100 exec/s: 43 rss: 70Mb L: 100/100 MS: 1 ChangeByte- 00:07:41.767 [2024-12-02 15:23:27.454605] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890883 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.454632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.454686] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.454701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.454757] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.454772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.454827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.454843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.767 
[2024-12-02 15:23:27.454899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.454916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:41.767 #44 NEW cov: 11897 ft: 14012 corp: 32/3051b lim: 100 exec/s: 44 rss: 70Mb L: 100/100 MS: 1 CopyPart- 00:07:41.767 [2024-12-02 15:23:27.494559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.494587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.494628] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331881750651044225 len:130 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.494645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.494700] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.494717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.494775] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.494791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.767 #45 NEW cov: 11897 ft: 14054 corp: 33/3136b lim: 100 exec/s: 45 rss: 70Mb L: 85/100 MS: 1 EraseBytes- 00:07:41.767 [2024-12-02 15:23:27.534349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882294971040129 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.534380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.534432] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.534448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.767 #46 NEW cov: 11897 ft: 14548 corp: 34/3194b lim: 100 exec/s: 46 rss: 70Mb L: 58/100 MS: 1 EraseBytes- 00:07:41.767 [2024-12-02 15:23:27.574987] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:556223529316 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 15:23:27.575016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.575067] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.767 [2024-12-02 
15:23:27.575084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:41.767 [2024-12-02 15:23:27.575141] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.768 [2024-12-02 15:23:27.575158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:41.768 [2024-12-02 15:23:27.575214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.768 [2024-12-02 15:23:27.575229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:41.768 [2024-12-02 15:23:27.575287] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331751454228185473 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:41.768 [2024-12-02 15:23:27.575303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:42.027 #47 NEW cov: 11897 ft: 14566 corp: 35/3294b lim: 100 exec/s: 47 rss: 70Mb L: 100/100 MS: 1 CrossOver- 00:07:42.027 [2024-12-02 15:23:27.614568] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882294971040129 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.027 [2024-12-02 15:23:27.614597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.027 [2024-12-02 15:23:27.614647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.027 [2024-12-02 15:23:27.614663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.027 #48 NEW cov: 11897 ft: 14575 corp: 36/3346b lim: 100 exec/s: 48 rss: 70Mb L: 52/100 MS: 1 EraseBytes- 00:07:42.027 [2024-12-02 15:23:27.654880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882294971040129 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.027 [2024-12-02 15:23:27.654908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.027 [2024-12-02 15:23:27.654947] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.027 [2024-12-02 15:23:27.654962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.027 [2024-12-02 15:23:27.655021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882294031516033 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.027 [2024-12-02 15:23:27.655037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.027 #49 NEW cov: 11897 ft: 14860 corp: 37/3423b lim: 100 exec/s: 49 rss: 70Mb L: 77/100 MS: 1 CopyPart- 00:07:42.027 [2024-12-02 15:23:27.695147] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.027 [2024-12-02 15:23:27.695174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.027 [2024-12-02 15:23:27.695213] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111869057 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.027 [2024-12-02 15:23:27.695229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.028 [2024-12-02 15:23:27.695285] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.028 [2024-12-02 15:23:27.695302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.028 [2024-12-02 15:23:27.695358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.028 [2024-12-02 15:23:27.695373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.028 #50 NEW cov: 11897 ft: 14896 corp: 38/3505b lim: 100 exec/s: 50 rss: 70Mb L: 82/100 MS: 1 InsertByte- 00:07:42.028 [2024-12-02 15:23:27.735446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882294971040129 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.028 [2024-12-02 15:23:27.735473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.028 [2024-12-02 15:23:27.735529] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.028 [2024-12-02 15:23:27.735545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.028 [2024-12-02 15:23:27.735599] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.028 [2024-12-02 15:23:27.735614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.028 [2024-12-02 15:23:27.735669] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.028 [2024-12-02 15:23:27.735684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.028 [2024-12-02 15:23:27.735741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331776742995624321 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.028 [2024-12-02 15:23:27.735757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:42.028 #51 NEW cov: 11897 ft: 14907 corp: 39/3605b lim: 100 exec/s: 51 rss: 70Mb L: 100/100 MS: 1 CopyPart- 
00:07:42.028 [2024-12-02 15:23:27.775364] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.028 [2024-12-02 15:23:27.775391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.028 [2024-12-02 15:23:27.775434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.028 [2024-12-02 15:23:27.775449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.028 [2024-12-02 15:23:27.775505] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.028 [2024-12-02 15:23:27.775519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.028 [2024-12-02 15:23:27.775577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.028 [2024-12-02 15:23:27.775592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.028 #52 NEW cov: 11897 ft: 14909 corp: 40/3691b lim: 100 exec/s: 52 rss: 70Mb L: 86/100 MS: 1 EraseBytes- 00:07:42.028 [2024-12-02 15:23:27.815635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.028 [2024-12-02 15:23:27.815662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.028 [2024-12-02 15:23:27.815716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.028 [2024-12-02 15:23:27.815736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.028 [2024-12-02 15:23:27.815793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.028 [2024-12-02 15:23:27.815808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.028 [2024-12-02 15:23:27.815864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.028 [2024-12-02 15:23:27.815880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.028 [2024-12-02 15:23:27.815936] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.028 [2024-12-02 15:23:27.815952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:42.028 #53 NEW cov: 11897 ft: 14940 corp: 41/3791b lim: 
100 exec/s: 53 rss: 70Mb L: 100/100 MS: 1 ShuffleBytes- 00:07:42.288 [2024-12-02 15:23:27.855614] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882296111890817 len:33025 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.288 [2024-12-02 15:23:27.855641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.288 [2024-12-02 15:23:27.855679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.288 [2024-12-02 15:23:27.855694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.288 [2024-12-02 15:23:27.855753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:142393223479808 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.288 [2024-12-02 15:23:27.855770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.288 [2024-12-02 15:23:27.855828] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.288 [2024-12-02 15:23:27.855843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.288 #54 NEW cov: 11897 ft: 14943 corp: 42/3878b lim: 100 exec/s: 54 rss: 70Mb L: 87/100 MS: 1 EraseBytes- 00:07:42.288 [2024-12-02 15:23:27.895913] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331882294971040129 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.288 [2024-12-02 15:23:27.895942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.288 [2024-12-02 15:23:27.895988] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.288 [2024-12-02 15:23:27.896005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.288 [2024-12-02 15:23:27.896058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:396740635408796160 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.288 [2024-12-02 15:23:27.896073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.288 [2024-12-02 15:23:27.896129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.288 [2024-12-02 15:23:27.896144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:42.288 [2024-12-02 15:23:27.896202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.288 [2024-12-02 15:23:27.896218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:42.288 
#55 NEW cov: 11897 ft: 14955 corp: 43/3978b lim: 100 exec/s: 55 rss: 70Mb L: 100/100 MS: 1 CrossOver- 00:07:42.288 [2024-12-02 15:23:27.935697] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:9331881883795030467 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.288 [2024-12-02 15:23:27.935725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:42.288 [2024-12-02 15:23:27.935770] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.288 [2024-12-02 15:23:27.935786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:42.288 [2024-12-02 15:23:27.935842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:9331882296111890817 len:33154 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:42.288 [2024-12-02 15:23:27.935859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:42.288 #56 NEW cov: 11897 ft: 14972 corp: 44/4047b lim: 100 exec/s: 28 rss: 70Mb L: 69/100 MS: 1 EraseBytes- 00:07:42.288 #56 DONE cov: 11897 ft: 14972 corp: 44/4047b lim: 100 exec/s: 28 rss: 70Mb 00:07:42.288 ###### Recommended dictionary. ###### 00:07:42.288 "\000\005" # Uses: 1 00:07:42.288 "\002\000\000\000" # Uses: 2 00:07:42.288 ###### End of recommended dictionary. ###### 00:07:42.288 Done 56 runs in 2 second(s) 00:07:42.288 15:23:28 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:07:42.288 15:23:28 -- ../common.sh@72 -- # (( i++ )) 00:07:42.288 15:23:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.288 15:23:28 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:07:42.288 00:07:42.288 real 1m5.281s 00:07:42.288 user 1m40.900s 00:07:42.288 sys 0m7.896s 00:07:42.288 15:23:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:42.288 15:23:28 -- common/autotest_common.sh@10 -- # set +x 00:07:42.288 ************************************ 00:07:42.288 END TEST nvmf_fuzz 00:07:42.288 ************************************ 00:07:42.549 15:23:28 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:42.549 15:23:28 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:42.549 15:23:28 -- fuzz/llvm.sh@20 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:07:42.549 15:23:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:42.549 15:23:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:42.549 15:23:28 -- common/autotest_common.sh@10 -- # set +x 00:07:42.549 ************************************ 00:07:42.549 START TEST vfio_fuzz 00:07:42.549 ************************************ 00:07:42.549 15:23:28 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:07:42.549 * Looking for test storage... 
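A note on reading the nvmf_fuzz output that just completed: the paired records are SPDK's qpair tracing. nvme_io_qpair_print_command logs each fuzzed COMPARE submission (sqid/cid identify the queue slot; fields like nsid:0 and the arbitrary LBAs are fuzzer-generated), and spdk_nvme_print_completion logs the target's response, INVALID NAMESPACE OR FORMAT (00/0b), i.e. generic status code 0x0b, which is the expected rejection for such inputs. The interleaved libFuzzer status lines read #N NEW cov: <coverage points> ft: <features> corp: <entries>/<bytes> exec/s: <rate> rss: <memory>, with MS: naming the mutation sequence and DE: the dictionary entry that produced the input. Assuming a saved copy of this output with one status record per line (console.log below is a placeholder name, not a path from this run), the closing throughput figure can be pulled out with a one-line awk filter:

  # Print the executions-per-second field from the final "#N DONE" summary record.
  awk '/DONE cov:/ { for (i = 1; i <= NF; i++) if ($i == "exec/s:") print $(i + 1) }' console.log

For the run above this yields 28, matching exec/s: 28 in the #56 DONE line.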
00:07:42.549 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:42.549 15:23:28 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:42.549 15:23:28 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:42.549 15:23:28 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:42.549 15:23:28 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:42.549 15:23:28 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:42.549 15:23:28 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:42.549 15:23:28 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:42.549 15:23:28 -- scripts/common.sh@335 -- # IFS=.-: 00:07:42.549 15:23:28 -- scripts/common.sh@335 -- # read -ra ver1 00:07:42.549 15:23:28 -- scripts/common.sh@336 -- # IFS=.-: 00:07:42.549 15:23:28 -- scripts/common.sh@336 -- # read -ra ver2 00:07:42.549 15:23:28 -- scripts/common.sh@337 -- # local 'op=<' 00:07:42.549 15:23:28 -- scripts/common.sh@339 -- # ver1_l=2 00:07:42.549 15:23:28 -- scripts/common.sh@340 -- # ver2_l=1 00:07:42.549 15:23:28 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:42.549 15:23:28 -- scripts/common.sh@343 -- # case "$op" in 00:07:42.549 15:23:28 -- scripts/common.sh@344 -- # : 1 00:07:42.549 15:23:28 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:42.549 15:23:28 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:42.549 15:23:28 -- scripts/common.sh@364 -- # decimal 1 00:07:42.549 15:23:28 -- scripts/common.sh@352 -- # local d=1 00:07:42.549 15:23:28 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:42.549 15:23:28 -- scripts/common.sh@354 -- # echo 1 00:07:42.549 15:23:28 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:42.549 15:23:28 -- scripts/common.sh@365 -- # decimal 2 00:07:42.549 15:23:28 -- scripts/common.sh@352 -- # local d=2 00:07:42.549 15:23:28 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:42.549 15:23:28 -- scripts/common.sh@354 -- # echo 2 00:07:42.549 15:23:28 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:42.549 15:23:28 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:42.549 15:23:28 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:42.549 15:23:28 -- scripts/common.sh@367 -- # return 0 00:07:42.549 15:23:28 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:42.549 15:23:28 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:42.549 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:42.549 --rc genhtml_branch_coverage=1 00:07:42.549 --rc genhtml_function_coverage=1 00:07:42.549 --rc genhtml_legend=1 00:07:42.549 --rc geninfo_all_blocks=1 00:07:42.549 --rc geninfo_unexecuted_blocks=1 00:07:42.549 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:42.549 ' 00:07:42.549 15:23:28 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:42.549 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:42.549 --rc genhtml_branch_coverage=1 00:07:42.549 --rc genhtml_function_coverage=1 00:07:42.549 --rc genhtml_legend=1 00:07:42.549 --rc geninfo_all_blocks=1 00:07:42.549 --rc geninfo_unexecuted_blocks=1 00:07:42.549 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:42.549 ' 00:07:42.550 15:23:28 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:42.550 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:42.550 --rc genhtml_branch_coverage=1 
00:07:42.550 --rc genhtml_function_coverage=1 00:07:42.550 --rc genhtml_legend=1 00:07:42.550 --rc geninfo_all_blocks=1 00:07:42.550 --rc geninfo_unexecuted_blocks=1 00:07:42.550 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:42.550 ' 00:07:42.550 15:23:28 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:42.550 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:42.550 --rc genhtml_branch_coverage=1 00:07:42.550 --rc genhtml_function_coverage=1 00:07:42.550 --rc genhtml_legend=1 00:07:42.550 --rc geninfo_all_blocks=1 00:07:42.550 --rc geninfo_unexecuted_blocks=1 00:07:42.550 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:42.550 ' 00:07:42.550 15:23:28 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:42.550 15:23:28 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:42.550 15:23:28 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:42.550 15:23:28 -- common/autotest_common.sh@34 -- # set -e 00:07:42.550 15:23:28 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:42.550 15:23:28 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:42.550 15:23:28 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:42.550 15:23:28 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:42.550 15:23:28 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:42.550 15:23:28 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:42.550 15:23:28 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:42.550 15:23:28 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:42.550 15:23:28 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:42.550 15:23:28 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:42.550 15:23:28 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:42.550 15:23:28 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:42.550 15:23:28 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:42.550 15:23:28 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:42.550 15:23:28 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:42.550 15:23:28 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:42.550 15:23:28 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:42.550 15:23:28 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:42.550 15:23:28 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:42.550 15:23:28 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:42.550 15:23:28 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:42.550 15:23:28 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:42.550 15:23:28 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:42.550 15:23:28 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:42.550 15:23:28 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:42.550 15:23:28 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:42.550 15:23:28 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:42.550 15:23:28 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:42.550 15:23:28 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:42.550 
15:23:28 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:42.550 15:23:28 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:42.550 15:23:28 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:42.550 15:23:28 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:42.550 15:23:28 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:42.550 15:23:28 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:42.550 15:23:28 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:42.550 15:23:28 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:42.550 15:23:28 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:42.550 15:23:28 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:42.550 15:23:28 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:42.550 15:23:28 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:42.550 15:23:28 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:42.550 15:23:28 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:42.550 15:23:28 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:42.550 15:23:28 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:07:42.550 15:23:28 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:42.550 15:23:28 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:42.550 15:23:28 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:42.550 15:23:28 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:42.550 15:23:28 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:42.550 15:23:28 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:42.550 15:23:28 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:42.550 15:23:28 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:42.550 15:23:28 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:42.550 15:23:28 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:42.550 15:23:28 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:42.550 15:23:28 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:42.550 15:23:28 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:42.550 15:23:28 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:42.550 15:23:28 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:42.550 15:23:28 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:42.550 15:23:28 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:42.550 15:23:28 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:42.550 15:23:28 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:42.550 15:23:28 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:07:42.550 15:23:28 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:42.550 15:23:28 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:42.550 15:23:28 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:42.550 15:23:28 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:42.550 15:23:28 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:42.550 15:23:28 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:42.550 15:23:28 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:42.550 15:23:28 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:42.550 15:23:28 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:42.550 15:23:28 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 
00:07:42.550 15:23:28 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:42.550 15:23:28 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:42.550 15:23:28 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:42.550 15:23:28 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:42.550 15:23:28 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:42.550 15:23:28 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:42.550 15:23:28 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:42.550 15:23:28 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:42.550 15:23:28 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:42.550 15:23:28 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:42.550 15:23:28 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:42.550 15:23:28 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:42.550 15:23:28 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:42.550 15:23:28 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:42.550 15:23:28 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:42.550 15:23:28 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:42.550 15:23:28 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:42.550 15:23:28 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:42.550 15:23:28 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:42.550 15:23:28 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:42.550 15:23:28 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:42.550 15:23:28 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:42.550 15:23:28 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:42.550 15:23:28 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:42.550 #define SPDK_CONFIG_H 00:07:42.550 #define SPDK_CONFIG_APPS 1 00:07:42.550 #define SPDK_CONFIG_ARCH native 00:07:42.550 #undef SPDK_CONFIG_ASAN 00:07:42.550 #undef SPDK_CONFIG_AVAHI 00:07:42.550 #undef SPDK_CONFIG_CET 00:07:42.550 #define SPDK_CONFIG_COVERAGE 1 00:07:42.550 #define SPDK_CONFIG_CROSS_PREFIX 00:07:42.550 #undef SPDK_CONFIG_CRYPTO 00:07:42.550 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:42.550 #undef SPDK_CONFIG_CUSTOMOCF 00:07:42.550 #undef SPDK_CONFIG_DAOS 00:07:42.550 #define SPDK_CONFIG_DAOS_DIR 00:07:42.550 #define SPDK_CONFIG_DEBUG 1 00:07:42.550 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:42.550 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:42.550 #define SPDK_CONFIG_DPDK_INC_DIR 00:07:42.550 #define SPDK_CONFIG_DPDK_LIB_DIR 00:07:42.550 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:42.550 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:42.550 #define SPDK_CONFIG_EXAMPLES 1 00:07:42.550 #undef SPDK_CONFIG_FC 00:07:42.550 #define SPDK_CONFIG_FC_PATH 00:07:42.550 #define SPDK_CONFIG_FIO_PLUGIN 1 
00:07:42.550 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:42.550 #undef SPDK_CONFIG_FUSE 00:07:42.550 #define SPDK_CONFIG_FUZZER 1 00:07:42.550 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:42.550 #undef SPDK_CONFIG_GOLANG 00:07:42.550 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:42.550 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:42.550 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:42.550 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:42.551 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:42.551 #define SPDK_CONFIG_IDXD 1 00:07:42.551 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:42.551 #undef SPDK_CONFIG_IPSEC_MB 00:07:42.551 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:42.551 #define SPDK_CONFIG_ISAL 1 00:07:42.551 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:42.551 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:42.551 #define SPDK_CONFIG_LIBDIR 00:07:42.551 #undef SPDK_CONFIG_LTO 00:07:42.551 #define SPDK_CONFIG_MAX_LCORES 00:07:42.551 #define SPDK_CONFIG_NVME_CUSE 1 00:07:42.551 #undef SPDK_CONFIG_OCF 00:07:42.551 #define SPDK_CONFIG_OCF_PATH 00:07:42.551 #define SPDK_CONFIG_OPENSSL_PATH 00:07:42.551 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:42.551 #undef SPDK_CONFIG_PGO_USE 00:07:42.551 #define SPDK_CONFIG_PREFIX /usr/local 00:07:42.551 #undef SPDK_CONFIG_RAID5F 00:07:42.551 #undef SPDK_CONFIG_RBD 00:07:42.551 #define SPDK_CONFIG_RDMA 1 00:07:42.551 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:42.551 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:42.551 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:42.551 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:42.551 #undef SPDK_CONFIG_SHARED 00:07:42.551 #undef SPDK_CONFIG_SMA 00:07:42.551 #define SPDK_CONFIG_TESTS 1 00:07:42.551 #undef SPDK_CONFIG_TSAN 00:07:42.551 #define SPDK_CONFIG_UBLK 1 00:07:42.551 #define SPDK_CONFIG_UBSAN 1 00:07:42.551 #undef SPDK_CONFIG_UNIT_TESTS 00:07:42.551 #undef SPDK_CONFIG_URING 00:07:42.551 #define SPDK_CONFIG_URING_PATH 00:07:42.551 #undef SPDK_CONFIG_URING_ZNS 00:07:42.551 #undef SPDK_CONFIG_USDT 00:07:42.551 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:42.551 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:42.551 #define SPDK_CONFIG_VFIO_USER 1 00:07:42.551 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:42.551 #define SPDK_CONFIG_VHOST 1 00:07:42.551 #define SPDK_CONFIG_VIRTIO 1 00:07:42.551 #undef SPDK_CONFIG_VTUNE 00:07:42.551 #define SPDK_CONFIG_VTUNE_DIR 00:07:42.551 #define SPDK_CONFIG_WERROR 1 00:07:42.551 #define SPDK_CONFIG_WPDK_DIR 00:07:42.551 #undef SPDK_CONFIG_XNVME 00:07:42.551 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:42.551 15:23:28 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:42.551 15:23:28 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:42.551 15:23:28 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:42.551 15:23:28 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:42.551 15:23:28 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:42.551 15:23:28 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:42.551 15:23:28 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:42.551 15:23:28 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:42.551 15:23:28 -- paths/export.sh@5 -- # export PATH 00:07:42.551 15:23:28 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:42.551 15:23:28 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:42.551 15:23:28 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:42.551 15:23:28 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:42.551 15:23:28 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:42.551 15:23:28 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:42.551 15:23:28 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:42.551 15:23:28 -- pm/common@16 -- # TEST_TAG=N/A 00:07:42.551 15:23:28 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:42.551 15:23:28 -- common/autotest_common.sh@52 -- # : 1 00:07:42.551 15:23:28 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:42.551 15:23:28 -- common/autotest_common.sh@56 -- # : 0 00:07:42.551 15:23:28 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:42.551 15:23:28 -- common/autotest_common.sh@58 -- # : 0 00:07:42.551 15:23:28 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:42.551 15:23:28 -- common/autotest_common.sh@60 -- # : 1 00:07:42.551 15:23:28 -- common/autotest_common.sh@61 -- # export 
SPDK_RUN_FUNCTIONAL_TEST 00:07:42.551 15:23:28 -- common/autotest_common.sh@62 -- # : 0 00:07:42.551 15:23:28 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:42.551 15:23:28 -- common/autotest_common.sh@64 -- # : 00:07:42.551 15:23:28 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:42.551 15:23:28 -- common/autotest_common.sh@66 -- # : 0 00:07:42.551 15:23:28 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:42.551 15:23:28 -- common/autotest_common.sh@68 -- # : 0 00:07:42.551 15:23:28 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:42.551 15:23:28 -- common/autotest_common.sh@70 -- # : 0 00:07:42.551 15:23:28 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:42.551 15:23:28 -- common/autotest_common.sh@72 -- # : 0 00:07:42.551 15:23:28 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:42.551 15:23:28 -- common/autotest_common.sh@74 -- # : 0 00:07:42.551 15:23:28 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:42.551 15:23:28 -- common/autotest_common.sh@76 -- # : 0 00:07:42.551 15:23:28 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:42.551 15:23:28 -- common/autotest_common.sh@78 -- # : 0 00:07:42.551 15:23:28 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:42.551 15:23:28 -- common/autotest_common.sh@80 -- # : 0 00:07:42.551 15:23:28 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:42.551 15:23:28 -- common/autotest_common.sh@82 -- # : 0 00:07:42.551 15:23:28 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:42.551 15:23:28 -- common/autotest_common.sh@84 -- # : 0 00:07:42.551 15:23:28 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:42.551 15:23:28 -- common/autotest_common.sh@86 -- # : 0 00:07:42.551 15:23:28 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:42.551 15:23:28 -- common/autotest_common.sh@88 -- # : 0 00:07:42.551 15:23:28 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:42.551 15:23:28 -- common/autotest_common.sh@90 -- # : 0 00:07:42.551 15:23:28 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:42.551 15:23:28 -- common/autotest_common.sh@92 -- # : 1 00:07:42.813 15:23:28 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:42.813 15:23:28 -- common/autotest_common.sh@94 -- # : 1 00:07:42.813 15:23:28 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:42.813 15:23:28 -- common/autotest_common.sh@96 -- # : rdma 00:07:42.813 15:23:28 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:42.813 15:23:28 -- common/autotest_common.sh@98 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:42.813 15:23:28 -- common/autotest_common.sh@100 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:42.813 15:23:28 -- common/autotest_common.sh@102 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:42.813 15:23:28 -- common/autotest_common.sh@104 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:42.813 15:23:28 -- common/autotest_common.sh@106 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:42.813 15:23:28 -- common/autotest_common.sh@108 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@109 
-- # export SPDK_TEST_VHOST_INIT 00:07:42.813 15:23:28 -- common/autotest_common.sh@110 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:42.813 15:23:28 -- common/autotest_common.sh@112 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:42.813 15:23:28 -- common/autotest_common.sh@114 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:42.813 15:23:28 -- common/autotest_common.sh@116 -- # : 1 00:07:42.813 15:23:28 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:42.813 15:23:28 -- common/autotest_common.sh@118 -- # : 00:07:42.813 15:23:28 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:42.813 15:23:28 -- common/autotest_common.sh@120 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:42.813 15:23:28 -- common/autotest_common.sh@122 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:42.813 15:23:28 -- common/autotest_common.sh@124 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:42.813 15:23:28 -- common/autotest_common.sh@126 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:42.813 15:23:28 -- common/autotest_common.sh@128 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:42.813 15:23:28 -- common/autotest_common.sh@130 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:42.813 15:23:28 -- common/autotest_common.sh@132 -- # : 00:07:42.813 15:23:28 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:42.813 15:23:28 -- common/autotest_common.sh@134 -- # : true 00:07:42.813 15:23:28 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:42.813 15:23:28 -- common/autotest_common.sh@136 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:42.813 15:23:28 -- common/autotest_common.sh@138 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:42.813 15:23:28 -- common/autotest_common.sh@140 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:42.813 15:23:28 -- common/autotest_common.sh@142 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:42.813 15:23:28 -- common/autotest_common.sh@144 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:42.813 15:23:28 -- common/autotest_common.sh@146 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:42.813 15:23:28 -- common/autotest_common.sh@148 -- # : 00:07:42.813 15:23:28 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:42.813 15:23:28 -- common/autotest_common.sh@150 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:42.813 15:23:28 -- common/autotest_common.sh@152 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:42.813 15:23:28 -- common/autotest_common.sh@154 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:42.813 15:23:28 -- common/autotest_common.sh@156 -- # : 0 00:07:42.813 15:23:28 -- 
common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:42.813 15:23:28 -- common/autotest_common.sh@158 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:42.813 15:23:28 -- common/autotest_common.sh@160 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:42.813 15:23:28 -- common/autotest_common.sh@163 -- # : 00:07:42.813 15:23:28 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:42.813 15:23:28 -- common/autotest_common.sh@165 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:42.813 15:23:28 -- common/autotest_common.sh@167 -- # : 0 00:07:42.813 15:23:28 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:42.813 15:23:28 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:42.813 15:23:28 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:42.813 15:23:28 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:42.813 15:23:28 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:42.813 15:23:28 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:42.813 15:23:28 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:42.813 15:23:28 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:42.813 15:23:28 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:42.813 15:23:28 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:42.813 15:23:28 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:42.813 15:23:28 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:42.813 15:23:28 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:42.813 15:23:28 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:42.813 15:23:28 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:42.813 15:23:28 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:42.813 15:23:28 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:42.813 15:23:28 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:42.813 15:23:28 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:42.813 15:23:28 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:42.813 15:23:28 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:42.813 15:23:28 -- common/autotest_common.sh@196 -- # cat 00:07:42.813 15:23:28 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:42.813 15:23:28 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:42.813 15:23:28 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:42.814 15:23:28 -- 
common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:42.814 15:23:28 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:42.814 15:23:28 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:42.814 15:23:28 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:42.814 15:23:28 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:42.814 15:23:28 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:42.814 15:23:28 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:42.814 15:23:28 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:42.814 15:23:28 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:42.814 15:23:28 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:42.814 15:23:28 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:42.814 15:23:28 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:42.814 15:23:28 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:42.814 15:23:28 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:42.814 15:23:28 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:42.814 15:23:28 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:42.814 15:23:28 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:07:42.814 15:23:28 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:07:42.814 15:23:28 -- common/autotest_common.sh@249 -- # _LCOV= 00:07:42.814 15:23:28 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:07:42.814 15:23:28 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:07:42.814 15:23:28 -- common/autotest_common.sh@250 -- # _LCOV=1 00:07:42.814 15:23:28 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:42.814 15:23:28 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:07:42.814 15:23:28 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:42.814 15:23:28 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:07:42.814 15:23:28 -- common/autotest_common.sh@259 -- # export valgrind= 00:07:42.814 15:23:28 -- common/autotest_common.sh@259 -- # valgrind= 00:07:42.814 15:23:28 -- common/autotest_common.sh@265 -- # uname -s 00:07:42.814 15:23:28 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:07:42.814 15:23:28 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:07:42.814 15:23:28 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:07:42.814 15:23:28 -- common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:07:42.814 15:23:28 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:42.814 15:23:28 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:42.814 15:23:28 
-- common/autotest_common.sh@275 -- # MAKE=make 00:07:42.814 15:23:28 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:07:42.814 15:23:28 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:07:42.814 15:23:28 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:07:42.814 15:23:28 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:42.814 15:23:28 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:07:42.814 15:23:28 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:07:42.814 15:23:28 -- common/autotest_common.sh@319 -- # [[ -z 1606819 ]] 00:07:42.814 15:23:28 -- common/autotest_common.sh@319 -- # kill -0 1606819 00:07:42.814 15:23:28 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:07:42.814 15:23:28 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:07:42.814 15:23:28 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:07:42.814 15:23:28 -- common/autotest_common.sh@332 -- # local mount target_dir 00:07:42.814 15:23:28 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:07:42.814 15:23:28 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:07:42.814 15:23:28 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:07:42.814 15:23:28 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:07:42.814 15:23:28 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.5GUJVJ 00:07:42.814 15:23:28 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:42.814 15:23:28 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:07:42.814 15:23:28 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:07:42.814 15:23:28 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.5GUJVJ/tests/vfio /tmp/spdk.5GUJVJ 00:07:42.814 15:23:28 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:07:42.814 15:23:28 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:42.814 15:23:28 -- common/autotest_common.sh@328 -- # df -T 00:07:42.814 15:23:28 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:07:42.814 15:23:28 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:07:42.814 15:23:28 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:07:42.814 15:23:28 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:07:42.814 15:23:28 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:07:42.814 15:23:28 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:07:42.814 15:23:28 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:42.814 15:23:28 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:07:42.814 15:23:28 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:07:42.814 15:23:28 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:07:42.814 15:23:28 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:07:42.814 15:23:28 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:07:42.814 15:23:28 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:42.814 15:23:28 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:07:42.814 15:23:28 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 
00:07:42.814 15:23:28 -- common/autotest_common.sh@363 -- # avails["$mount"]=54433497088 00:07:42.814 15:23:28 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730574336 00:07:42.814 15:23:28 -- common/autotest_common.sh@364 -- # uses["$mount"]=7297077248 00:07:42.814 15:23:28 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:42.814 15:23:28 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:42.814 15:23:28 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:42.814 15:23:28 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864027648 00:07:42.814 15:23:28 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865285120 00:07:42.814 15:23:28 -- common/autotest_common.sh@364 -- # uses["$mount"]=1257472 00:07:42.814 15:23:28 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:42.814 15:23:28 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:42.814 15:23:28 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:42.814 15:23:28 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340117504 00:07:42.814 15:23:28 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346118144 00:07:42.814 15:23:28 -- common/autotest_common.sh@364 -- # uses["$mount"]=6000640 00:07:42.814 15:23:28 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:42.814 15:23:28 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:42.814 15:23:28 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:42.814 15:23:28 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864945152 00:07:42.814 15:23:28 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865289216 00:07:42.814 15:23:28 -- common/autotest_common.sh@364 -- # uses["$mount"]=344064 00:07:42.814 15:23:28 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:42.814 15:23:28 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:42.814 15:23:28 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:42.814 15:23:28 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:07:42.814 15:23:28 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:07:42.814 15:23:28 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:07:42.814 15:23:28 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:42.814 15:23:28 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:07:42.814 * Looking for test storage... 
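(The trace just above and continuing below is autotest_common.sh's set_test_storage() sizing up every mount from `df -T` and then walking storage_candidates until one has enough headroom. A condensed Bash sketch of that selection logic follows; the function name is invented, $testdir is assumed to be set by the harness, and the tmpfs/ramfs special cases visible later in the trace are omitted — this is an illustration of the technique, not the verbatim SPDK source.)

    #!/usr/bin/env bash
    # Sketch of set_test_storage: pick a directory with at least
    # $requested_size bytes free, preferring $testdir, then a per-test
    # directory under a mktemp fallback, then the fallback itself --
    # the same storage_candidates order seen in the trace.
    set_test_storage_sketch() {
        local requested_size=$1
        local target_dir mount target_space new_size
        local -A mounts fss sizes avails uses
        local source fs size use avail _
        local storage_fallback
        storage_fallback=$(mktemp -udt spdk.XXXXXX)   # e.g. /tmp/spdk.5GUJVJ
        local storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback")
        # The real script creates the candidate directories up front.
        mkdir -p "${storage_candidates[@]}"

        # Index every mount by its mount point; df -T reports 1K blocks,
        # so scale to bytes to match the byte counts printed in the trace.
        while read -r source fs size use avail _ mount; do
            mounts["$mount"]=$source
            fss["$mount"]=$fs
            sizes["$mount"]=$(( size * 1024 ))
            avails["$mount"]=$(( avail * 1024 ))
            uses["$mount"]=$(( use * 1024 ))
        done < <(df -T | grep -v Filesystem)

        for target_dir in "${storage_candidates[@]}"; do
            # Resolve which mount point the candidate lives on.
            mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
            target_space=${avails["$mount"]}
            (( target_space == 0 || target_space < requested_size )) && continue
            # Refuse a pick that would push the filesystem past ~95% full
            # (the new_size check the trace performs below).
            new_size=$(( ${uses["$mount"]} + requested_size ))
            (( new_size * 100 / ${sizes["$mount"]} > 95 )) && continue
            export SPDK_TEST_STORAGE=$target_dir
            printf '* Found test storage at %s\n' "$target_dir"
            return 0
        done
        return 1
    }

    # Roughly what this run did: a 2 GiB request padded by 64 MiB of
    # overhead (2147483648 -> 2214592512 bytes, as in the trace).
    testdir=$PWD
    set_test_storage_sketch $(( 2 * 1024 * 1024 * 1024 + 64 * 1024 * 1024 ))

(In this run the first candidate sits on the overlay root with ~54 GB available, so the very first iteration succeeds, as the trace shows next.)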
00:07:42.814 15:23:28 -- common/autotest_common.sh@369 -- # local target_space new_size 00:07:42.814 15:23:28 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:07:42.814 15:23:28 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:42.814 15:23:28 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:42.814 15:23:28 -- common/autotest_common.sh@373 -- # mount=/ 00:07:42.814 15:23:28 -- common/autotest_common.sh@375 -- # target_space=54433497088 00:07:42.814 15:23:28 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:07:42.814 15:23:28 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:07:42.814 15:23:28 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:07:42.814 15:23:28 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:07:42.814 15:23:28 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:07:42.814 15:23:28 -- common/autotest_common.sh@382 -- # new_size=9511669760 00:07:42.814 15:23:28 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:42.814 15:23:28 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:42.814 15:23:28 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:42.814 15:23:28 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:42.814 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:42.814 15:23:28 -- common/autotest_common.sh@390 -- # return 0 00:07:42.814 15:23:28 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:07:42.814 15:23:28 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:07:42.814 15:23:28 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:42.814 15:23:28 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:42.814 15:23:28 -- common/autotest_common.sh@1682 -- # true 00:07:42.815 15:23:28 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:07:42.815 15:23:28 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:42.815 15:23:28 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:42.815 15:23:28 -- common/autotest_common.sh@27 -- # exec 00:07:42.815 15:23:28 -- common/autotest_common.sh@29 -- # exec 00:07:42.815 15:23:28 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:42.815 15:23:28 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:07:42.815 15:23:28 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:42.815 15:23:28 -- common/autotest_common.sh@18 -- # set -x 00:07:42.815 15:23:28 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:42.815 15:23:28 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:42.815 15:23:28 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:42.815 15:23:28 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:42.815 15:23:28 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:42.815 15:23:28 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:42.815 15:23:28 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:42.815 15:23:28 -- scripts/common.sh@335 -- # IFS=.-: 00:07:42.815 15:23:28 -- scripts/common.sh@335 -- # read -ra ver1 00:07:42.815 15:23:28 -- scripts/common.sh@336 -- # IFS=.-: 00:07:42.815 15:23:28 -- scripts/common.sh@336 -- # read -ra ver2 00:07:42.815 15:23:28 -- scripts/common.sh@337 -- # local 'op=<' 00:07:42.815 15:23:28 -- scripts/common.sh@339 -- # ver1_l=2 00:07:42.815 15:23:28 -- scripts/common.sh@340 -- # ver2_l=1 00:07:42.815 15:23:28 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:42.815 15:23:28 -- scripts/common.sh@343 -- # case "$op" in 00:07:42.815 15:23:28 -- scripts/common.sh@344 -- # : 1 00:07:42.815 15:23:28 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:42.815 15:23:28 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:42.815 15:23:28 -- scripts/common.sh@364 -- # decimal 1 00:07:42.815 15:23:28 -- scripts/common.sh@352 -- # local d=1 00:07:42.815 15:23:28 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:42.815 15:23:28 -- scripts/common.sh@354 -- # echo 1 00:07:42.815 15:23:28 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:42.815 15:23:28 -- scripts/common.sh@365 -- # decimal 2 00:07:42.815 15:23:28 -- scripts/common.sh@352 -- # local d=2 00:07:42.815 15:23:28 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:42.815 15:23:28 -- scripts/common.sh@354 -- # echo 2 00:07:42.815 15:23:28 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:42.815 15:23:28 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:42.815 15:23:28 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:42.815 15:23:28 -- scripts/common.sh@367 -- # return 0 00:07:42.815 15:23:28 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:42.815 15:23:28 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:42.815 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:42.815 --rc genhtml_branch_coverage=1 00:07:42.815 --rc genhtml_function_coverage=1 00:07:42.815 --rc genhtml_legend=1 00:07:42.815 --rc geninfo_all_blocks=1 00:07:42.815 --rc geninfo_unexecuted_blocks=1 00:07:42.815 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:42.815 ' 00:07:42.815 15:23:28 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:42.815 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:42.815 --rc genhtml_branch_coverage=1 00:07:42.815 --rc genhtml_function_coverage=1 00:07:42.815 --rc genhtml_legend=1 00:07:42.815 --rc geninfo_all_blocks=1 00:07:42.815 --rc geninfo_unexecuted_blocks=1 00:07:42.815 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:42.815 ' 00:07:42.815 15:23:28 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:42.815 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:07:42.815 --rc genhtml_branch_coverage=1 00:07:42.815 --rc genhtml_function_coverage=1 00:07:42.815 --rc genhtml_legend=1 00:07:42.815 --rc geninfo_all_blocks=1 00:07:42.815 --rc geninfo_unexecuted_blocks=1 00:07:42.815 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:42.815 ' 00:07:42.815 15:23:28 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:42.815 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:42.815 --rc genhtml_branch_coverage=1 00:07:42.815 --rc genhtml_function_coverage=1 00:07:42.815 --rc genhtml_legend=1 00:07:42.815 --rc geninfo_all_blocks=1 00:07:42.815 --rc geninfo_unexecuted_blocks=1 00:07:42.815 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:42.815 ' 00:07:42.815 15:23:28 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:07:42.815 15:23:28 -- ../common.sh@8 -- # pids=() 00:07:42.815 15:23:28 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:07:42.815 15:23:28 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:07:42.815 15:23:28 -- vfio/run.sh@59 -- # fuzz_num=7 00:07:42.815 15:23:28 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:07:42.815 15:23:28 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:07:42.815 15:23:28 -- vfio/run.sh@65 -- # mem_size=0 00:07:42.815 15:23:28 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:07:42.815 15:23:28 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:07:42.815 15:23:28 -- ../common.sh@69 -- # local fuzz_num=7 00:07:42.815 15:23:28 -- ../common.sh@70 -- # local time=1 00:07:42.815 15:23:28 -- ../common.sh@72 -- # (( i = 0 )) 00:07:42.815 15:23:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.815 15:23:28 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:42.815 15:23:28 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:07:42.815 15:23:28 -- vfio/run.sh@23 -- # local timen=1 00:07:42.815 15:23:28 -- vfio/run.sh@24 -- # local core=0x1 00:07:42.815 15:23:28 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:07:42.815 15:23:28 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:07:42.815 15:23:28 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:07:42.815 15:23:28 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:07:42.815 15:23:28 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:07:42.815 15:23:28 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:07:42.815 15:23:28 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:07:42.815 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:42.815 15:23:28 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 
-Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:07:42.815 [2024-12-02 15:23:28.578907] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:42.815 [2024-12-02 15:23:28.579003] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1606992 ] 00:07:42.815 EAL: No free 2048 kB hugepages reported on node 1 00:07:43.075 [2024-12-02 15:23:28.652279] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.075 [2024-12-02 15:23:28.720116] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:43.075 [2024-12-02 15:23:28.720246] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.334 INFO: Running with entropic power schedule (0xFF, 100). 00:07:43.334 INFO: Seed: 1814251493 00:07:43.334 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:07:43.334 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:07:43.334 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:07:43.334 INFO: A corpus is not provided, starting from an empty corpus 00:07:43.334 #2 INITED exec/s: 0 rss: 61Mb 00:07:43.334 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:43.335 This may also happen if the target rejected all inputs we tried so far 00:07:43.595 NEW_FUNC[1/631]: 0x43a218 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:07:43.595 NEW_FUNC[2/631]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:43.595 #17 NEW cov: 10762 ft: 10723 corp: 2/8b lim: 60 exec/s: 0 rss: 67Mb L: 7/7 MS: 5 InsertByte-CopyPart-CrossOver-CrossOver-InsertRepeatedBytes- 00:07:43.854 #24 NEW cov: 10779 ft: 13618 corp: 3/47b lim: 60 exec/s: 0 rss: 68Mb L: 39/39 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:44.113 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:44.113 #25 NEW cov: 10796 ft: 14430 corp: 4/53b lim: 60 exec/s: 0 rss: 69Mb L: 6/39 MS: 1 EraseBytes- 00:07:44.372 #26 NEW cov: 10796 ft: 15355 corp: 5/61b lim: 60 exec/s: 26 rss: 69Mb L: 8/39 MS: 1 InsertByte- 00:07:44.372 #27 NEW cov: 10796 ft: 16443 corp: 6/116b lim: 60 exec/s: 27 rss: 69Mb L: 55/55 MS: 1 CopyPart- 00:07:44.632 #28 NEW cov: 10796 ft: 16503 corp: 7/171b lim: 60 exec/s: 28 rss: 69Mb L: 55/55 MS: 1 ShuffleBytes- 00:07:44.891 #29 NEW cov: 10796 ft: 16585 corp: 8/227b lim: 60 exec/s: 29 rss: 69Mb L: 56/56 MS: 1 InsertByte- 00:07:45.150 #30 NEW cov: 10803 ft: 16868 corp: 9/235b lim: 60 exec/s: 30 rss: 69Mb L: 8/56 MS: 1 ChangeBinInt- 00:07:45.150 #35 NEW cov: 10803 ft: 16985 corp: 10/259b lim: 60 exec/s: 17 rss: 69Mb L: 24/56 MS: 5 EraseBytes-EraseBytes-ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:07:45.150 #35 DONE cov: 10803 ft: 16985 corp: 10/259b lim: 60 exec/s: 17 rss: 69Mb 00:07:45.150 Done 35 runs in 2 second(s) 00:07:45.410 15:23:31 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:07:45.410 15:23:31 -- ../common.sh@72 -- # (( i++ )) 00:07:45.410 15:23:31 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:45.410 15:23:31 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:45.410 15:23:31 
-- vfio/run.sh@22 -- # local fuzzer_type=1 00:07:45.670 15:23:31 -- vfio/run.sh@23 -- # local timen=1 00:07:45.670 15:23:31 -- vfio/run.sh@24 -- # local core=0x1 00:07:45.670 15:23:31 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:07:45.670 15:23:31 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:07:45.670 15:23:31 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:07:45.670 15:23:31 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:07:45.670 15:23:31 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:07:45.670 15:23:31 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:07:45.670 15:23:31 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:07:45.670 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:45.670 15:23:31 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:07:45.670 [2024-12-02 15:23:31.263210] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:45.670 [2024-12-02 15:23:31.263277] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1607397 ] 00:07:45.670 EAL: No free 2048 kB hugepages reported on node 1 00:07:45.670 [2024-12-02 15:23:31.336115] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.670 [2024-12-02 15:23:31.409091] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:45.670 [2024-12-02 15:23:31.409222] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.929 INFO: Running with entropic power schedule (0xFF, 100). 00:07:45.929 INFO: Seed: 214275319 00:07:45.929 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:07:45.929 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:07:45.929 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:07:45.929 INFO: A corpus is not provided, starting from an empty corpus 00:07:45.929 #2 INITED exec/s: 0 rss: 62Mb 00:07:45.929 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:45.929 This may also happen if the target rejected all inputs we tried so far 00:07:45.929 [2024-12-02 15:23:31.735775] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:45.929 [2024-12-02 15:23:31.735810] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:45.929 [2024-12-02 15:23:31.735830] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:46.446 NEW_FUNC[1/636]: 0x43a7b8 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:07:46.446 NEW_FUNC[2/636]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:46.446 #15 NEW cov: 10776 ft: 10698 corp: 2/6b lim: 40 exec/s: 0 rss: 68Mb L: 5/5 MS: 3 CopyPart-CrossOver-InsertRepeatedBytes- 00:07:46.446 [2024-12-02 15:23:32.229411] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:46.446 [2024-12-02 15:23:32.229444] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:46.446 [2024-12-02 15:23:32.229462] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:46.705 NEW_FUNC[1/1]: 0x162e8a8 in nvme_qpair_get_state /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1456 00:07:46.705 #16 NEW cov: 10792 ft: 13144 corp: 3/11b lim: 40 exec/s: 0 rss: 69Mb L: 5/5 MS: 1 ChangeBinInt- 00:07:46.705 [2024-12-02 15:23:32.421668] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:46.705 [2024-12-02 15:23:32.421691] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:46.705 [2024-12-02 15:23:32.421708] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:46.964 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:46.964 #17 NEW cov: 10809 ft: 14338 corp: 4/15b lim: 40 exec/s: 0 rss: 70Mb L: 4/5 MS: 1 EraseBytes- 00:07:46.964 [2024-12-02 15:23:32.619147] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:46.964 [2024-12-02 15:23:32.619170] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:46.964 [2024-12-02 15:23:32.619187] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:46.964 #19 NEW cov: 10809 ft: 14796 corp: 5/20b lim: 40 exec/s: 19 rss: 70Mb L: 5/5 MS: 2 ShuffleBytes-CMP- DE: "\377\377\377\377"- 00:07:47.222 [2024-12-02 15:23:32.826485] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:47.222 [2024-12-02 15:23:32.826509] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:47.222 [2024-12-02 15:23:32.826527] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:47.222 #20 NEW cov: 10809 ft: 15039 corp: 6/26b lim: 40 exec/s: 20 rss: 70Mb L: 6/6 MS: 1 CMP- DE: "\001\011"- 00:07:47.222 [2024-12-02 15:23:33.017762] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:47.222 [2024-12-02 15:23:33.017786] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:47.222 [2024-12-02 15:23:33.017803] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 
return failure 00:07:47.481 #21 NEW cov: 10809 ft: 15156 corp: 7/35b lim: 40 exec/s: 21 rss: 70Mb L: 9/9 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:07:47.481 [2024-12-02 15:23:33.209957] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:47.481 [2024-12-02 15:23:33.209980] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:47.481 [2024-12-02 15:23:33.210001] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:47.740 #22 NEW cov: 10809 ft: 15461 corp: 8/44b lim: 40 exec/s: 22 rss: 70Mb L: 9/9 MS: 1 CrossOver- 00:07:47.740 [2024-12-02 15:23:33.407816] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:47.740 [2024-12-02 15:23:33.407839] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:47.740 [2024-12-02 15:23:33.407856] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:47.740 #23 NEW cov: 10816 ft: 15703 corp: 9/49b lim: 40 exec/s: 23 rss: 70Mb L: 5/9 MS: 1 InsertByte- 00:07:48.000 [2024-12-02 15:23:33.605428] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:07:48.000 [2024-12-02 15:23:33.605450] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:07:48.000 [2024-12-02 15:23:33.605467] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:07:48.000 #24 NEW cov: 10816 ft: 16006 corp: 10/55b lim: 40 exec/s: 12 rss: 70Mb L: 6/9 MS: 1 InsertByte- 00:07:48.000 #24 DONE cov: 10816 ft: 16006 corp: 10/55b lim: 40 exec/s: 12 rss: 70Mb 00:07:48.000 ###### Recommended dictionary. ###### 00:07:48.000 "\377\377\377\377" # Uses: 1 00:07:48.000 "\001\011" # Uses: 0 00:07:48.000 ###### End of recommended dictionary. 
###### 00:07:48.000 Done 24 runs in 2 second(s) 00:07:48.259 15:23:33 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:07:48.259 15:23:33 -- ../common.sh@72 -- # (( i++ )) 00:07:48.259 15:23:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:48.259 15:23:33 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:48.259 15:23:33 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:07:48.259 15:23:33 -- vfio/run.sh@23 -- # local timen=1 00:07:48.259 15:23:33 -- vfio/run.sh@24 -- # local core=0x1 00:07:48.259 15:23:33 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:07:48.259 15:23:33 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:07:48.259 15:23:33 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:07:48.259 15:23:33 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:07:48.259 15:23:33 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:07:48.259 15:23:33 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:07:48.259 15:23:33 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:07:48.259 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:48.259 15:23:33 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:07:48.259 [2024-12-02 15:23:34.024288] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:48.259 [2024-12-02 15:23:34.024374] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1607833 ] 00:07:48.259 EAL: No free 2048 kB hugepages reported on node 1 00:07:48.518 [2024-12-02 15:23:34.096033] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.518 [2024-12-02 15:23:34.164941] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:48.518 [2024-12-02 15:23:34.165070] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.777 INFO: Running with entropic power schedule (0xFF, 100). 00:07:48.777 INFO: Seed: 2975299215 00:07:48.777 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:07:48.777 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:07:48.777 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:07:48.777 INFO: A corpus is not provided, starting from an empty corpus 00:07:48.777 #2 INITED exec/s: 0 rss: 62Mb 00:07:48.777 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:48.777 This may also happen if the target rejected all inputs we tried so far 00:07:48.777 [2024-12-02 15:23:34.467810] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:49.037 NEW_FUNC[1/636]: 0x43b1a8 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:07:49.037 NEW_FUNC[2/636]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:49.037 #3 NEW cov: 10758 ft: 10714 corp: 2/10b lim: 80 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 CMP- DE: "\001\226K\315E\033\022b"- 00:07:49.296 [2024-12-02 15:23:34.917323] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:49.296 #9 NEW cov: 10772 ft: 13517 corp: 3/19b lim: 80 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 PersAutoDict- DE: "\001\226K\315E\033\022b"- 00:07:49.296 [2024-12-02 15:23:35.111784] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:49.554 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:49.554 #10 NEW cov: 10789 ft: 14287 corp: 4/28b lim: 80 exec/s: 0 rss: 70Mb L: 9/9 MS: 1 ChangeBit- 00:07:49.554 [2024-12-02 15:23:35.296514] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:49.812 #11 NEW cov: 10789 ft: 14873 corp: 5/38b lim: 80 exec/s: 11 rss: 70Mb L: 10/10 MS: 1 InsertByte- 00:07:49.812 [2024-12-02 15:23:35.478959] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:49.812 #12 NEW cov: 10792 ft: 15059 corp: 6/47b lim: 80 exec/s: 12 rss: 70Mb L: 9/10 MS: 1 ChangeBinInt- 00:07:50.071 [2024-12-02 15:23:35.661548] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:50.071 #13 NEW cov: 10792 ft: 15142 corp: 7/57b lim: 80 exec/s: 13 rss: 70Mb L: 10/10 MS: 1 InsertByte- 00:07:50.071 [2024-12-02 15:23:35.845998] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:50.330 #14 NEW cov: 10792 ft: 15227 corp: 8/66b lim: 80 exec/s: 14 rss: 70Mb L: 9/10 MS: 1 ChangeBinInt- 00:07:50.330 [2024-12-02 15:23:36.031168] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:50.330 #15 NEW cov: 10792 ft: 15971 corp: 9/77b lim: 80 exec/s: 15 rss: 70Mb L: 11/11 MS: 1 InsertByte- 00:07:50.604 [2024-12-02 15:23:36.215573] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:50.604 #16 NEW cov: 10799 ft: 16533 corp: 10/87b lim: 80 exec/s: 16 rss: 70Mb L: 10/11 MS: 1 ChangeBit- 00:07:50.604 [2024-12-02 15:23:36.396386] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:50.864 #17 NEW cov: 10799 ft: 16735 corp: 11/105b lim: 80 exec/s: 8 rss: 70Mb L: 18/18 MS: 1 CrossOver- 00:07:50.864 #17 DONE cov: 10799 ft: 16735 corp: 11/105b lim: 80 exec/s: 8 rss: 70Mb 00:07:50.864 ###### Recommended dictionary. ###### 00:07:50.864 "\001\226K\315E\033\022b" # Uses: 1 00:07:50.864 ###### End of recommended dictionary. 
###### 00:07:50.864 Done 17 runs in 2 second(s) 00:07:51.123 15:23:36 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:07:51.123 15:23:36 -- ../common.sh@72 -- # (( i++ )) 00:07:51.123 15:23:36 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:51.123 15:23:36 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:51.123 15:23:36 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:07:51.123 15:23:36 -- vfio/run.sh@23 -- # local timen=1 00:07:51.123 15:23:36 -- vfio/run.sh@24 -- # local core=0x1 00:07:51.123 15:23:36 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:07:51.123 15:23:36 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:07:51.123 15:23:36 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:07:51.123 15:23:36 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:07:51.123 15:23:36 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:07:51.124 15:23:36 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:07:51.124 15:23:36 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:07:51.124 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:51.124 15:23:36 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:07:51.124 [2024-12-02 15:23:36.798725] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:51.124 [2024-12-02 15:23:36.798795] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1608380 ] 00:07:51.124 EAL: No free 2048 kB hugepages reported on node 1 00:07:51.124 [2024-12-02 15:23:36.868005] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.124 [2024-12-02 15:23:36.934884] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:51.124 [2024-12-02 15:23:36.935010] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.383 INFO: Running with entropic power schedule (0xFF, 100). 00:07:51.383 INFO: Seed: 1442317083 00:07:51.383 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:07:51.383 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:07:51.383 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:07:51.383 INFO: A corpus is not provided, starting from an empty corpus 00:07:51.383 #2 INITED exec/s: 0 rss: 62Mb 00:07:51.383 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:51.383 This may also happen if the target rejected all inputs we tried so far 00:07:51.642 [2024-12-02 15:23:37.217793] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=323 offset=0 prot=0x3: Invalid argument 00:07:51.642 [2024-12-02 15:23:37.217826] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:07:51.642 [2024-12-02 15:23:37.217837] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:07:51.642 [2024-12-02 15:23:37.217854] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:07:51.901 NEW_FUNC[1/638]: 0x43b898 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:07:51.901 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:51.901 #12 NEW cov: 10780 ft: 10741 corp: 2/82b lim: 320 exec/s: 0 rss: 67Mb L: 81/81 MS: 5 ShuffleBytes-ChangeBinInt-ChangeBit-InsertByte-InsertRepeatedBytes- 00:07:52.161 #16 NEW cov: 10798 ft: 14130 corp: 3/136b lim: 320 exec/s: 0 rss: 69Mb L: 54/81 MS: 4 InsertByte-CrossOver-CrossOver-InsertRepeatedBytes- 00:07:52.161 [2024-12-02 15:23:37.893701] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:07:52.161 [2024-12-02 15:23:37.893744] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:07:52.161 [2024-12-02 15:23:37.893755] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:07:52.161 [2024-12-02 15:23:37.893772] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:07:52.420 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:52.420 #17 NEW cov: 10815 ft: 14893 corp: 4/204b lim: 320 exec/s: 0 rss: 70Mb L: 68/81 MS: 1 EraseBytes- 00:07:52.420 #18 NEW cov: 10815 ft: 15588 corp: 5/258b lim: 320 exec/s: 18 rss: 70Mb L: 54/81 MS: 1 ChangeByte- 00:07:52.679 #24 NEW cov: 10815 ft: 15962 corp: 6/312b lim: 320 exec/s: 24 rss: 70Mb L: 54/81 MS: 1 ChangeBit- 00:07:52.939 #25 NEW cov: 10815 ft: 16095 corp: 7/366b lim: 320 exec/s: 25 rss: 70Mb L: 54/81 MS: 1 ShuffleBytes- 00:07:53.198 #26 NEW cov: 10815 ft: 16673 corp: 8/402b lim: 320 exec/s: 26 rss: 70Mb L: 36/81 MS: 1 InsertRepeatedBytes- 00:07:53.198 #27 NEW cov: 10822 ft: 16858 corp: 9/483b lim: 320 exec/s: 27 rss: 70Mb L: 81/81 MS: 1 ChangeBit- 00:07:53.457 #28 NEW cov: 10822 ft: 16921 corp: 10/573b lim: 320 exec/s: 28 rss: 70Mb L: 90/90 MS: 1 CrossOver- 00:07:53.717 #31 NEW cov: 10822 ft: 17070 corp: 11/633b lim: 320 exec/s: 15 rss: 70Mb L: 60/90 MS: 3 InsertByte-ChangeBit-InsertRepeatedBytes- 00:07:53.717 #31 DONE cov: 10822 ft: 17070 corp: 11/633b lim: 320 exec/s: 15 rss: 70Mb 00:07:53.717 Done 31 runs in 2 second(s) 00:07:54.001 15:23:39 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:07:54.001 15:23:39 -- ../common.sh@72 -- # (( i++ )) 00:07:54.001 15:23:39 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:54.001 15:23:39 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:54.001 15:23:39 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:07:54.001 15:23:39 -- vfio/run.sh@23 -- 
# local timen=1 00:07:54.001 15:23:39 -- vfio/run.sh@24 -- # local core=0x1 00:07:54.001 15:23:39 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:07:54.001 15:23:39 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:07:54.001 15:23:39 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:07:54.001 15:23:39 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:07:54.001 15:23:39 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:07:54.001 15:23:39 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:07:54.001 15:23:39 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:07:54.001 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:54.001 15:23:39 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:07:54.001 [2024-12-02 15:23:39.606597] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:54.001 [2024-12-02 15:23:39.606647] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1608923 ] 00:07:54.001 EAL: No free 2048 kB hugepages reported on node 1 00:07:54.001 [2024-12-02 15:23:39.675435] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.001 [2024-12-02 15:23:39.744380] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:54.001 [2024-12-02 15:23:39.744512] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.276 INFO: Running with entropic power schedule (0xFF, 100). 00:07:54.276 INFO: Seed: 4259326416 00:07:54.276 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:07:54.276 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:07:54.276 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:07:54.276 INFO: A corpus is not provided, starting from an empty corpus 00:07:54.276 #2 INITED exec/s: 0 rss: 62Mb 00:07:54.276 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:54.276 This may also happen if the target rejected all inputs we tried so far 00:07:54.794 NEW_FUNC[1/632]: 0x43c118 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:07:54.794 NEW_FUNC[2/632]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:54.794 #12 NEW cov: 10746 ft: 10685 corp: 2/101b lim: 320 exec/s: 0 rss: 67Mb L: 100/100 MS: 5 ChangeBit-CrossOver-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:07:55.053 #17 NEW cov: 10765 ft: 13838 corp: 3/162b lim: 320 exec/s: 0 rss: 69Mb L: 61/100 MS: 5 CMP-EraseBytes-ShuffleBytes-EraseBytes-InsertRepeatedBytes- DE: "&\000MW\320K\226\000"- 00:07:55.053 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:55.053 #18 NEW cov: 10782 ft: 15696 corp: 4/262b lim: 320 exec/s: 0 rss: 70Mb L: 100/100 MS: 1 CrossOver- 00:07:55.311 #19 NEW cov: 10782 ft: 16062 corp: 5/483b lim: 320 exec/s: 19 rss: 70Mb L: 221/221 MS: 1 InsertRepeatedBytes- 00:07:55.570 #20 NEW cov: 10782 ft: 16383 corp: 6/607b lim: 320 exec/s: 20 rss: 70Mb L: 124/221 MS: 1 InsertRepeatedBytes- 00:07:55.830 #21 NEW cov: 10782 ft: 16533 corp: 7/707b lim: 320 exec/s: 21 rss: 70Mb L: 100/221 MS: 1 CopyPart- 00:07:55.830 #27 NEW cov: 10785 ft: 16802 corp: 8/807b lim: 320 exec/s: 27 rss: 70Mb L: 100/221 MS: 1 ChangeBit- 00:07:56.089 #28 NEW cov: 10792 ft: 16989 corp: 9/1028b lim: 320 exec/s: 28 rss: 70Mb L: 221/221 MS: 1 CopyPart- 00:07:56.348 #29 NEW cov: 10792 ft: 17391 corp: 10/1128b lim: 320 exec/s: 14 rss: 70Mb L: 100/221 MS: 1 PersAutoDict- DE: "&\000MW\320K\226\000"- 00:07:56.348 #29 DONE cov: 10792 ft: 17391 corp: 10/1128b lim: 320 exec/s: 14 rss: 70Mb 00:07:56.348 ###### Recommended dictionary. ###### 00:07:56.348 "&\000MW\320K\226\000" # Uses: 2 00:07:56.348 ###### End of recommended dictionary. 
###### 00:07:56.348 Done 29 runs in 2 second(s) 00:07:56.609 15:23:42 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:07:56.609 15:23:42 -- ../common.sh@72 -- # (( i++ )) 00:07:56.609 15:23:42 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:56.609 15:23:42 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:56.609 15:23:42 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:07:56.609 15:23:42 -- vfio/run.sh@23 -- # local timen=1 00:07:56.609 15:23:42 -- vfio/run.sh@24 -- # local core=0x1 00:07:56.609 15:23:42 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:07:56.609 15:23:42 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:07:56.609 15:23:42 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:07:56.609 15:23:42 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:07:56.609 15:23:42 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:07:56.609 15:23:42 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:07:56.609 15:23:42 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:07:56.609 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:56.609 15:23:42 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:07:56.609 [2024-12-02 15:23:42.282992] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:56.609 [2024-12-02 15:23:42.283070] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1609439 ] 00:07:56.609 EAL: No free 2048 kB hugepages reported on node 1 00:07:56.609 [2024-12-02 15:23:42.355000] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.609 [2024-12-02 15:23:42.423173] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:56.609 [2024-12-02 15:23:42.423307] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.869 INFO: Running with entropic power schedule (0xFF, 100). 00:07:56.869 INFO: Seed: 2636355118 00:07:56.869 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:07:56.869 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:07:56.869 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:07:56.869 INFO: A corpus is not provided, starting from an empty corpus 00:07:56.869 #2 INITED exec/s: 0 rss: 62Mb 00:07:56.869 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:56.869 This may also happen if the target rejected all inputs we tried so far 00:07:57.128 [2024-12-02 15:23:42.716768] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:57.128 [2024-12-02 15:23:42.716812] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:57.388 NEW_FUNC[1/638]: 0x43cb18 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:07:57.388 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:57.388 #31 NEW cov: 10776 ft: 10716 corp: 2/108b lim: 120 exec/s: 0 rss: 67Mb L: 107/107 MS: 4 CrossOver-CrossOver-ChangeByte-InsertRepeatedBytes- 00:07:57.388 [2024-12-02 15:23:43.194092] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:57.388 [2024-12-02 15:23:43.194135] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:57.648 #36 NEW cov: 10795 ft: 13744 corp: 3/145b lim: 120 exec/s: 0 rss: 69Mb L: 37/107 MS: 5 ChangeBit-CrossOver-ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:07:57.648 [2024-12-02 15:23:43.386767] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:57.648 [2024-12-02 15:23:43.386799] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:57.907 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:57.907 #42 NEW cov: 10815 ft: 14547 corp: 4/182b lim: 120 exec/s: 0 rss: 70Mb L: 37/107 MS: 1 ChangeByte- 00:07:57.907 [2024-12-02 15:23:43.570024] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:57.907 [2024-12-02 15:23:43.570055] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:57.907 #43 NEW cov: 10815 ft: 15809 corp: 5/289b lim: 120 exec/s: 43 rss: 70Mb L: 107/107 MS: 1 CrossOver- 00:07:58.166 [2024-12-02 15:23:43.751502] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:58.166 [2024-12-02 15:23:43.751535] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:58.166 #47 NEW cov: 10815 ft: 16089 corp: 6/317b lim: 120 exec/s: 47 rss: 70Mb L: 28/107 MS: 4 CopyPart-CopyPart-CMP-InsertRepeatedBytes- DE: "\377\377\377\377\377\377\377t"- 00:07:58.166 [2024-12-02 15:23:43.943126] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:58.166 [2024-12-02 15:23:43.943159] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:58.426 #48 NEW cov: 10815 ft: 16677 corp: 7/355b lim: 120 exec/s: 48 rss: 70Mb L: 38/107 MS: 1 CrossOver- 00:07:58.426 [2024-12-02 15:23:44.125866] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:58.426 [2024-12-02 15:23:44.125898] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:58.426 #49 NEW cov: 10815 ft: 16969 corp: 8/388b lim: 120 exec/s: 49 rss: 70Mb L: 33/107 MS: 1 EraseBytes- 00:07:58.686 [2024-12-02 15:23:44.305380] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:58.686 [2024-12-02 15:23:44.305409] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:58.686 #50 
NEW cov: 10815 ft: 17030 corp: 9/417b lim: 120 exec/s: 50 rss: 70Mb L: 29/107 MS: 1 EraseBytes- 00:07:58.686 [2024-12-02 15:23:44.487537] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:58.686 [2024-12-02 15:23:44.487566] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:58.946 #51 NEW cov: 10822 ft: 17208 corp: 10/450b lim: 120 exec/s: 51 rss: 70Mb L: 33/107 MS: 1 ChangeByte- 00:07:58.946 [2024-12-02 15:23:44.670878] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:58.946 [2024-12-02 15:23:44.670907] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:59.207 #52 NEW cov: 10822 ft: 17243 corp: 11/557b lim: 120 exec/s: 26 rss: 70Mb L: 107/107 MS: 1 ChangeBit- 00:07:59.207 #52 DONE cov: 10822 ft: 17243 corp: 11/557b lim: 120 exec/s: 26 rss: 70Mb 00:07:59.207 ###### Recommended dictionary. ###### 00:07:59.207 "\377\377\377\377\377\377\377t" # Uses: 0 00:07:59.207 ###### End of recommended dictionary. ###### 00:07:59.207 Done 52 runs in 2 second(s) 00:07:59.465 15:23:45 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:07:59.465 15:23:45 -- ../common.sh@72 -- # (( i++ )) 00:07:59.465 15:23:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:59.465 15:23:45 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:59.465 15:23:45 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:07:59.465 15:23:45 -- vfio/run.sh@23 -- # local timen=1 00:07:59.465 15:23:45 -- vfio/run.sh@24 -- # local core=0x1 00:07:59.465 15:23:45 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:07:59.465 15:23:45 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:07:59.465 15:23:45 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:07:59.465 15:23:45 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:07:59.465 15:23:45 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:07:59.465 15:23:45 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:07:59.465 15:23:45 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:07:59.465 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:59.465 15:23:45 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:07:59.465 [2024-12-02 15:23:45.081053] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
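The "#N NEW" records above are standard libFuzzer status lines: cov is the number of coverage points hit so far, ft the number of features (finer-grained coverage signals), corp the corpus size as items/bytes, lim the current input-length cap, exec/s the throughput, rss the resident memory, L the new input's length against the longest in the corpus, and MS/DE the mutation sequence (and any dictionary entry) that produced it. A small hypothetical helper for pulling coverage growth out of a saved run, assuming the output was captured to fuzz.log and the records keep exactly the shape shown here:

    # Summarize coverage growth; field positions assume the record shape above,
    # e.g. "#36 NEW cov: 10795 ft: 13744 corp: 3/145b lim: 120 exec/s: 0 ..."
    grep -Eo '#[0-9]+ (NEW|REDUCE) cov: [0-9]+ ft: [0-9]+ corp: [0-9]+/[0-9]+b' fuzz.log |
    awk '{ gsub("#", "", $1)
           printf "exec=%-8s cov=%-6s ft=%-6s corp=%s\n", $1, $4, $6, $8 }'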
00:07:59.465 [2024-12-02 15:23:45.081119] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1609843 ] 00:07:59.465 EAL: No free 2048 kB hugepages reported on node 1 00:07:59.465 [2024-12-02 15:23:45.154275] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.465 [2024-12-02 15:23:45.224505] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:59.465 [2024-12-02 15:23:45.224635] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.723 INFO: Running with entropic power schedule (0xFF, 100). 00:07:59.723 INFO: Seed: 1144384661 00:07:59.723 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:07:59.723 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:07:59.723 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:07:59.723 INFO: A corpus is not provided, starting from an empty corpus 00:07:59.723 #2 INITED exec/s: 0 rss: 62Mb 00:07:59.723 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:59.723 This may also happen if the target rejected all inputs we tried so far 00:07:59.723 [2024-12-02 15:23:45.486824] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:59.723 [2024-12-02 15:23:45.486865] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:00.240 NEW_FUNC[1/638]: 0x43d808 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:08:00.240 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:00.240 #16 NEW cov: 10769 ft: 10491 corp: 2/17b lim: 90 exec/s: 0 rss: 67Mb L: 16/16 MS: 4 CopyPart-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:00.240 [2024-12-02 15:23:45.888671] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:00.240 [2024-12-02 15:23:45.888718] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:00.240 #17 NEW cov: 10783 ft: 13151 corp: 3/33b lim: 90 exec/s: 0 rss: 69Mb L: 16/16 MS: 1 CopyPart- 00:08:00.240 [2024-12-02 15:23:46.002539] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:00.240 [2024-12-02 15:23:46.002574] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:00.499 #22 NEW cov: 10783 ft: 14214 corp: 4/54b lim: 90 exec/s: 0 rss: 70Mb L: 21/21 MS: 5 CrossOver-InsertByte-EraseBytes-ChangeByte-InsertRepeatedBytes- 00:08:00.499 [2024-12-02 15:23:46.125443] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:00.499 [2024-12-02 15:23:46.125478] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:00.499 #23 NEW cov: 10783 ft: 14707 corp: 5/70b lim: 90 exec/s: 0 rss: 70Mb L: 16/21 MS: 1 ChangeBinInt- 00:08:00.499 [2024-12-02 15:23:46.238257] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:00.499 [2024-12-02 15:23:46.238292] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:00.499 NEW_FUNC[1/1]: 
0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:00.499 #24 NEW cov: 10800 ft: 15025 corp: 6/154b lim: 90 exec/s: 0 rss: 70Mb L: 84/84 MS: 1 InsertRepeatedBytes- 00:08:00.759 [2024-12-02 15:23:46.351047] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:00.759 [2024-12-02 15:23:46.351082] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:00.759 #27 NEW cov: 10800 ft: 15815 corp: 7/195b lim: 90 exec/s: 0 rss: 70Mb L: 41/84 MS: 3 CrossOver-ChangeByte-InsertRepeatedBytes- 00:08:00.759 [2024-12-02 15:23:46.463754] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:00.759 [2024-12-02 15:23:46.463793] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:00.759 #36 NEW cov: 10800 ft: 15869 corp: 8/268b lim: 90 exec/s: 36 rss: 70Mb L: 73/84 MS: 4 ShuffleBytes-ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:00.759 [2024-12-02 15:23:46.577657] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:00.759 [2024-12-02 15:23:46.577693] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:01.018 #41 NEW cov: 10800 ft: 16236 corp: 9/311b lim: 90 exec/s: 41 rss: 70Mb L: 43/84 MS: 5 ShuffleBytes-ChangeBit-CrossOver-CrossOver-InsertRepeatedBytes- 00:08:01.018 [2024-12-02 15:23:46.691410] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:01.018 [2024-12-02 15:23:46.691444] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:01.018 #47 NEW cov: 10800 ft: 16334 corp: 10/355b lim: 90 exec/s: 47 rss: 70Mb L: 44/84 MS: 1 InsertRepeatedBytes- 00:08:01.018 [2024-12-02 15:23:46.804309] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:01.018 [2024-12-02 15:23:46.804345] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:01.278 #48 NEW cov: 10800 ft: 16607 corp: 11/397b lim: 90 exec/s: 48 rss: 70Mb L: 42/84 MS: 1 InsertByte- 00:08:01.278 [2024-12-02 15:23:46.918103] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:01.278 [2024-12-02 15:23:46.918148] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:01.278 #49 NEW cov: 10800 ft: 16713 corp: 12/413b lim: 90 exec/s: 49 rss: 70Mb L: 16/84 MS: 1 ChangeBit- 00:08:01.278 [2024-12-02 15:23:47.033904] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:01.278 [2024-12-02 15:23:47.033939] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:01.537 #50 NEW cov: 10800 ft: 16734 corp: 13/429b lim: 90 exec/s: 50 rss: 70Mb L: 16/84 MS: 1 ShuffleBytes- 00:08:01.537 [2024-12-02 15:23:47.146561] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:01.537 [2024-12-02 15:23:47.146594] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:01.537 #51 NEW cov: 10800 ft: 16932 corp: 14/471b lim: 90 exec/s: 51 rss: 70Mb L: 42/84 MS: 1 CrossOver- 00:08:01.537 [2024-12-02 15:23:47.258360] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:01.537 [2024-12-02 15:23:47.258394] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 
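The repeated vfio_user_log rejections in these runs are the expected signal, not failures: per the NEW_FUNC frames, each input drives fuzz_vfio_user_set_msix or fuzz_vfio_user_irq_set to send a malformed command 8 (consistent with VFIO_USER_DEVICE_SET_IRQS in the vfio-user protocol), and the target answers with Invalid argument. When triaging such a log, the useful step is separating that noise from anything unexpected; a hypothetical helper, assuming the output was saved to fuzz.log:

    # Fold digit runs to 'N' so ERROR lines group by shape, then count each
    # shape. The expected "cmd N failed: Invalid argument" rejections dominate;
    # rare shapes near the bottom of the counts are the ones worth a closer look.
    grep '\*ERROR\*' fuzz.log | sed 's/[0-9]\+/N/g' | sort | uniq -c | sort -rn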
00:08:01.537 #52 NEW cov: 10807 ft: 17098 corp: 15/544b lim: 90 exec/s: 52 rss: 70Mb L: 73/84 MS: 1 ChangeBit- 00:08:01.797 [2024-12-02 15:23:47.372232] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:01.797 [2024-12-02 15:23:47.372268] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:01.797 #53 NEW cov: 10807 ft: 17108 corp: 16/586b lim: 90 exec/s: 26 rss: 70Mb L: 42/84 MS: 1 ChangeBit- 00:08:01.797 #53 DONE cov: 10807 ft: 17108 corp: 16/586b lim: 90 exec/s: 26 rss: 70Mb 00:08:01.797 Done 53 runs in 2 second(s) 00:08:02.056 15:23:47 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:08:02.056 15:23:47 -- ../common.sh@72 -- # (( i++ )) 00:08:02.056 15:23:47 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:02.056 15:23:47 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:08:02.056 00:08:02.056 real 0m19.579s 00:08:02.056 user 0m27.445s 00:08:02.056 sys 0m1.815s 00:08:02.056 15:23:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:02.056 15:23:47 -- common/autotest_common.sh@10 -- # set +x 00:08:02.056 ************************************ 00:08:02.056 END TEST vfio_fuzz 00:08:02.056 ************************************ 00:08:02.056 00:08:02.056 real 1m25.155s 00:08:02.056 user 2m8.482s 00:08:02.056 sys 0m9.905s 00:08:02.056 15:23:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:02.056 15:23:47 -- common/autotest_common.sh@10 -- # set +x 00:08:02.056 ************************************ 00:08:02.056 END TEST llvm_fuzz 00:08:02.056 ************************************ 00:08:02.056 15:23:47 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]] 00:08:02.056 15:23:47 -- spdk/autotest.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:08:02.056 15:23:47 -- spdk/autotest.sh@372 -- # timing_enter post_cleanup 00:08:02.056 15:23:47 -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:02.056 15:23:47 -- common/autotest_common.sh@10 -- # set +x 00:08:02.056 15:23:47 -- spdk/autotest.sh@373 -- # autotest_cleanup 00:08:02.056 15:23:47 -- common/autotest_common.sh@1381 -- # local autotest_es=0 00:08:02.056 15:23:47 -- common/autotest_common.sh@1382 -- # xtrace_disable 00:08:02.056 15:23:47 -- common/autotest_common.sh@10 -- # set +x 00:08:08.629 INFO: APP EXITING 00:08:08.629 INFO: killing all VMs 00:08:08.629 INFO: killing vhost app 00:08:08.629 INFO: EXIT DONE 00:08:11.165 Waiting for block devices as requested 00:08:11.165 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:11.165 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:11.425 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:11.425 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:11.425 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:11.684 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:11.684 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:11.684 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:11.684 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:11.943 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:11.943 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:11.943 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:12.202 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:12.202 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:12.202 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:12.461 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:12.461 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:08:15.751 Cleaning 00:08:15.751 Removing: /dev/shm/spdk_tgt_trace.pid1571511 00:08:15.751 Removing: 
/var/run/dpdk/spdk_pid1569032 00:08:15.751 Removing: /var/run/dpdk/spdk_pid1570293 00:08:15.751 Removing: /var/run/dpdk/spdk_pid1571511 00:08:15.751 Removing: /var/run/dpdk/spdk_pid1572308 00:08:15.751 Removing: /var/run/dpdk/spdk_pid1572634 00:08:15.751 Removing: /var/run/dpdk/spdk_pid1572972 00:08:15.751 Removing: /var/run/dpdk/spdk_pid1573315 00:08:15.751 Removing: /var/run/dpdk/spdk_pid1573646 00:08:15.751 Removing: /var/run/dpdk/spdk_pid1573935 00:08:15.751 Removing: /var/run/dpdk/spdk_pid1574220 00:08:15.751 Removing: /var/run/dpdk/spdk_pid1574547 00:08:16.010 Removing: /var/run/dpdk/spdk_pid1575404 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1578634 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1579042 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1579346 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1579799 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1580597 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1580629 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1581195 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1581222 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1581509 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1581781 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1582010 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1582091 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1582722 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1582900 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1583064 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1583369 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1583679 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1583699 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1583873 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1584043 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1584312 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1584580 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1584871 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1585139 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1585426 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1585698 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1585916 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1586083 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1586300 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1586573 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1586855 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1587129 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1587410 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1587678 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1587972 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1588143 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1588360 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1588556 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1588840 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1589106 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1589400 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1589666 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1589949 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1590135 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1590348 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1590528 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1590811 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1591083 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1591364 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1591632 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1591928 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1592164 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1592376 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1592568 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1592802 00:08:16.011 Removing: 
/var/run/dpdk/spdk_pid1593070 00:08:16.011 Removing: /var/run/dpdk/spdk_pid1593360 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1593628 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1593910 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1593996 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1594420 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1595097 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1595642 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1596127 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1596475 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1597021 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1597558 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1597917 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1598396 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1598940 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1599330 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1599780 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1600317 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1600800 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1601154 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1601699 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1602237 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1602557 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1603077 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1603616 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1603991 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1604452 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1604990 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1605343 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1605823 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1606364 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1606992 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1607397 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1607833 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1608380 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1608923 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1609439 00:08:16.270 Removing: /var/run/dpdk/spdk_pid1609843 00:08:16.270 Clean 00:08:16.529 killing process with pid 1521418 00:08:20.724 killing process with pid 1521415 00:08:20.724 killing process with pid 1521417 00:08:20.724 killing process with pid 1521416 00:08:20.724 15:24:05 -- common/autotest_common.sh@1446 -- # return 0 00:08:20.724 15:24:05 -- spdk/autotest.sh@374 -- # timing_exit post_cleanup 00:08:20.724 15:24:05 -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:20.724 15:24:05 -- common/autotest_common.sh@10 -- # set +x 00:08:20.724 15:24:06 -- spdk/autotest.sh@376 -- # timing_exit autotest 00:08:20.724 15:24:06 -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:20.724 15:24:06 -- common/autotest_common.sh@10 -- # set +x 00:08:20.724 15:24:06 -- spdk/autotest.sh@377 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:08:20.724 15:24:06 -- spdk/autotest.sh@379 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:08:20.724 15:24:06 -- spdk/autotest.sh@379 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:08:20.724 15:24:06 -- spdk/autotest.sh@381 -- # [[ y == y ]] 00:08:20.724 15:24:06 -- spdk/autotest.sh@383 -- # hostname 00:08:20.724 15:24:06 -- spdk/autotest.sh@383 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:08:20.724 geninfo: WARNING: invalid characters removed from testname! 00:08:21.293 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcda 00:08:21.293 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcda 00:08:21.293 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcda 00:08:31.279 15:24:17 -- spdk/autotest.sh@384 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:08:39.402 15:24:23 -- spdk/autotest.sh@385 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:08:43.593 15:24:28 -- spdk/autotest.sh@389 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:08:47.894 15:24:33 -- spdk/autotest.sh@390 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:08:52.087 15:24:37 -- spdk/autotest.sh@391 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:08:57.363 15:24:42 -- 
spdk/autotest.sh@392 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:01.553 15:24:47 -- spdk/autotest.sh@393 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:09:01.553 15:24:47 -- common/autotest_common.sh@1689 -- $ [[ y == y ]] 00:09:01.553 15:24:47 -- common/autotest_common.sh@1690 -- $ lcov --version 00:09:01.553 15:24:47 -- common/autotest_common.sh@1690 -- $ awk '{print $NF}' 00:09:01.813 15:24:47 -- common/autotest_common.sh@1690 -- $ lt 1.15 2 00:09:01.813 15:24:47 -- scripts/common.sh@372 -- $ cmp_versions 1.15 '<' 2 00:09:01.813 15:24:47 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:09:01.813 15:24:47 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:09:01.813 15:24:47 -- scripts/common.sh@335 -- $ IFS=.-: 00:09:01.813 15:24:47 -- scripts/common.sh@335 -- $ read -ra ver1 00:09:01.813 15:24:47 -- scripts/common.sh@336 -- $ IFS=.-: 00:09:01.813 15:24:47 -- scripts/common.sh@336 -- $ read -ra ver2 00:09:01.813 15:24:47 -- scripts/common.sh@337 -- $ local 'op=<' 00:09:01.813 15:24:47 -- scripts/common.sh@339 -- $ ver1_l=2 00:09:01.813 15:24:47 -- scripts/common.sh@340 -- $ ver2_l=1 00:09:01.813 15:24:47 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:09:01.813 15:24:47 -- scripts/common.sh@343 -- $ case "$op" in 00:09:01.813 15:24:47 -- scripts/common.sh@344 -- $ : 1 00:09:01.813 15:24:47 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:09:01.813 15:24:47 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:01.813 15:24:47 -- scripts/common.sh@364 -- $ decimal 1 00:09:01.813 15:24:47 -- scripts/common.sh@352 -- $ local d=1 00:09:01.813 15:24:47 -- scripts/common.sh@353 -- $ [[ 1 =~ ^[0-9]+$ ]] 00:09:01.813 15:24:47 -- scripts/common.sh@354 -- $ echo 1 00:09:01.813 15:24:47 -- scripts/common.sh@364 -- $ ver1[v]=1 00:09:01.813 15:24:47 -- scripts/common.sh@365 -- $ decimal 2 00:09:01.813 15:24:47 -- scripts/common.sh@352 -- $ local d=2 00:09:01.813 15:24:47 -- scripts/common.sh@353 -- $ [[ 2 =~ ^[0-9]+$ ]] 00:09:01.813 15:24:47 -- scripts/common.sh@354 -- $ echo 2 00:09:01.813 15:24:47 -- scripts/common.sh@365 -- $ ver2[v]=2 00:09:01.813 15:24:47 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:09:01.813 15:24:47 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] )) 00:09:01.814 15:24:47 -- scripts/common.sh@367 -- $ return 0 00:09:01.814 15:24:47 -- common/autotest_common.sh@1691 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:01.814 15:24:47 -- common/autotest_common.sh@1703 -- $ export 'LCOV_OPTS= 00:09:01.814 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:01.814 --rc genhtml_branch_coverage=1 00:09:01.814 --rc genhtml_function_coverage=1 00:09:01.814 --rc genhtml_legend=1 00:09:01.814 --rc geninfo_all_blocks=1 00:09:01.814 --rc geninfo_unexecuted_blocks=1 00:09:01.814 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:01.814 ' 00:09:01.814 15:24:47 -- common/autotest_common.sh@1703 -- $ LCOV_OPTS=' 00:09:01.814 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:01.814 --rc genhtml_branch_coverage=1 00:09:01.814 --rc genhtml_function_coverage=1 00:09:01.814 --rc genhtml_legend=1 00:09:01.814 --rc geninfo_all_blocks=1 00:09:01.814 --rc geninfo_unexecuted_blocks=1 00:09:01.814 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:01.814 ' 00:09:01.814 15:24:47 -- common/autotest_common.sh@1704 -- $ export 'LCOV=lcov 00:09:01.814 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:01.814 --rc genhtml_branch_coverage=1 00:09:01.814 --rc genhtml_function_coverage=1 00:09:01.814 --rc genhtml_legend=1 00:09:01.814 --rc geninfo_all_blocks=1 00:09:01.814 --rc geninfo_unexecuted_blocks=1 00:09:01.814 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:01.814 ' 00:09:01.814 15:24:47 -- common/autotest_common.sh@1704 -- $ LCOV='lcov 00:09:01.814 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:01.814 --rc genhtml_branch_coverage=1 00:09:01.814 --rc genhtml_function_coverage=1 00:09:01.814 --rc genhtml_legend=1 00:09:01.814 --rc geninfo_all_blocks=1 00:09:01.814 --rc geninfo_unexecuted_blocks=1 00:09:01.814 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:01.814 ' 00:09:01.814 15:24:47 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:01.814 15:24:47 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:09:01.814 15:24:47 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:01.814 15:24:47 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:01.814 15:24:47 -- paths/export.sh@2 -- $ 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:01.814 15:24:47 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:01.814 15:24:47 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:01.814 15:24:47 -- paths/export.sh@5 -- $ export PATH 00:09:01.814 15:24:47 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:01.814 15:24:47 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:09:01.814 15:24:47 -- common/autobuild_common.sh@440 -- $ date +%s 00:09:01.814 15:24:47 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1733149487.XXXXXX 00:09:01.814 15:24:47 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1733149487.3yfW5O 00:09:01.814 15:24:47 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]] 00:09:01.814 15:24:47 -- common/autobuild_common.sh@446 -- $ '[' -n '' ']' 00:09:01.814 15:24:47 -- common/autobuild_common.sh@449 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/' 00:09:01.814 15:24:47 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:09:01.814 15:24:47 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:09:01.814 15:24:47 -- common/autobuild_common.sh@456 -- $ get_config_params 00:09:01.814 15:24:47 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:09:01.814 15:24:47 -- common/autotest_common.sh@10 -- $ set +x 00:09:01.814 15:24:47 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user' 00:09:01.814 15:24:47 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112 00:09:01.814 15:24:47 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 
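The autotest.sh trace above also records how the job folds coverage: one lcov merge of the pre-test baseline with the post-test capture, then successive lcov -r passes that strip DPDK, system headers, and a few SPDK example apps out of the totals. Condensed (the long --rc genhtml options are trimmed here; the filter list is verbatim from the trace):

    # Condensed from the lcov invocations in the trace above.
    OUT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
    GCOV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
    LCOV="lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 -q --gcov-tool $GCOV"

    # Merge the pre-test baseline with the post-test capture...
    $LCOV -a "$OUT/cov_base.info" -a "$OUT/cov_test.info" -o "$OUT/cov_total.info"

    # ...then prune everything that is not SPDK code under test. The real run
    # also adds --ignore-errors unused,unused on the '/usr/*' pass.
    for pat in '*/dpdk/*' '/usr/*' '*/examples/vmd/*' '*/app/spdk_lspci/*' '*/app/spdk_top/*'; do
        $LCOV -r "$OUT/cov_total.info" "$pat" -o "$OUT/cov_total.info"
    done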
00:09:01.814 15:24:47 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:09:01.814 15:24:47 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]] 00:09:01.814 15:24:47 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:09:01.814 15:24:47 -- spdk/autopackage.sh@19 -- $ timing_finish 00:09:01.814 15:24:47 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:09:01.814 15:24:47 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:09:01.814 15:24:47 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:01.814 15:24:47 -- spdk/autopackage.sh@20 -- $ exit 0 00:09:01.814 + [[ -n 1478011 ]] 00:09:01.814 + sudo kill 1478011 00:09:01.824 [Pipeline] } 00:09:01.843 [Pipeline] // stage 00:09:01.849 [Pipeline] } 00:09:01.866 [Pipeline] // timeout 00:09:01.873 [Pipeline] } 00:09:01.889 [Pipeline] // catchError 00:09:01.896 [Pipeline] } 00:09:01.912 [Pipeline] // wrap 00:09:01.919 [Pipeline] } 00:09:01.932 [Pipeline] // catchError 00:09:01.943 [Pipeline] stage 00:09:01.947 [Pipeline] { (Epilogue) 00:09:01.961 [Pipeline] catchError 00:09:01.963 [Pipeline] { 00:09:01.977 [Pipeline] echo 00:09:01.979 Cleanup processes 00:09:01.986 [Pipeline] sh 00:09:02.273 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:02.273 1619834 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:02.288 [Pipeline] sh 00:09:02.576 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:09:02.576 ++ grep -v 'sudo pgrep' 00:09:02.576 ++ awk '{print $1}' 00:09:02.576 + sudo kill -9 00:09:02.576 + true 00:09:02.589 [Pipeline] sh 00:09:02.874 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:09:02.874 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:09:02.874 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:09:04.254 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB 00:09:14.249 [Pipeline] sh 00:09:14.537 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:09:14.537 Artifacts sizes are good 00:09:14.553 [Pipeline] archiveArtifacts 00:09:14.560 Archiving artifacts 00:09:14.688 [Pipeline] sh 00:09:15.001 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest 00:09:15.016 [Pipeline] cleanWs 00:09:15.025 [WS-CLEANUP] Deleting project workspace... 00:09:15.025 [WS-CLEANUP] Deferred wipeout is used... 00:09:15.032 [WS-CLEANUP] done 00:09:15.034 [Pipeline] } 00:09:15.053 [Pipeline] // catchError 00:09:15.065 [Pipeline] sh 00:09:15.349 + logger -p user.info -t JENKINS-CI 00:09:15.359 [Pipeline] } 00:09:15.373 [Pipeline] // stage 00:09:15.378 [Pipeline] } 00:09:15.394 [Pipeline] // node 00:09:15.401 [Pipeline] End of Pipeline 00:09:15.438 Finished: SUCCESS
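One idiom from the epilogue trace is worth calling out: the cleanup pipeline lists candidate SPDK processes with pgrep -af (matching and printing full command lines), drops the pgrep invocation itself with grep -v, and finishes with kill -9 ... || true so that an empty PID list (the "+ true" seen above) does not abort a set -e script. As a reusable sketch, with a function name of my own choosing:

    # kill_workspace_orphans is a hypothetical wrapper; the pipeline inside
    # mirrors the cleanup steps in the trace above.
    kill_workspace_orphans() {
        local ws=$1 pids
        pids=$(sudo pgrep -af "$ws/spdk" | grep -v 'sudo pgrep' | awk '{print $1}')
        # With no PIDs, kill exits non-zero; '|| true' keeps set -e scripts alive.
        sudo kill -9 $pids || true
    }

    kill_workspace_orphans /var/jenkins/workspace/short-fuzz-phy-autotest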