00:00:00.000 Started by upstream project "autotest-nightly-lts" build number 2417 00:00:00.000 originally caused by: 00:00:00.000 Started by upstream project "nightly-trigger" build number 3678 00:00:00.000 originally caused by: 00:00:00.000 Started by timer 00:00:00.000 Started by timer 00:00:00.015 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.017 The recommended git tool is: git 00:00:00.017 using credential 00000000-0000-0000-0000-000000000002 00:00:00.020 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.034 Fetching changes from the remote Git repository 00:00:00.037 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.074 Using shallow fetch with depth 1 00:00:00.074 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.074 > git --version # timeout=10 00:00:00.091 > git --version # 'git version 2.39.2' 00:00:00.091 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.121 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.121 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:03.009 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:03.021 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:03.034 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD) 00:00:03.034 > git config core.sparsecheckout # timeout=10 00:00:03.044 > git read-tree -mu HEAD # timeout=10 00:00:03.059 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5 00:00:03.080 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag" 00:00:03.080 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10 00:00:03.164 [Pipeline] Start of Pipeline 00:00:03.177 [Pipeline] library 00:00:03.179 Loading library shm_lib@master 00:00:03.179 Library shm_lib@master is cached. Copying from home. 00:00:03.197 [Pipeline] node 00:00:03.211 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:03.213 [Pipeline] { 00:00:03.223 [Pipeline] catchError 00:00:03.225 [Pipeline] { 00:00:03.238 [Pipeline] wrap 00:00:03.247 [Pipeline] { 00:00:03.255 [Pipeline] stage 00:00:03.258 [Pipeline] { (Prologue) 00:00:03.475 [Pipeline] sh 00:00:03.756 + logger -p user.info -t JENKINS-CI 00:00:03.775 [Pipeline] echo 00:00:03.777 Node: WFP20 00:00:03.783 [Pipeline] sh 00:00:04.077 [Pipeline] setCustomBuildProperty 00:00:04.087 [Pipeline] echo 00:00:04.089 Cleanup processes 00:00:04.094 [Pipeline] sh 00:00:04.378 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.378 3059800 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.392 [Pipeline] sh 00:00:04.671 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.671 ++ grep -v 'sudo pgrep' 00:00:04.671 ++ awk '{print $1}' 00:00:04.671 + sudo kill -9 00:00:04.671 + true 00:00:04.685 [Pipeline] cleanWs 00:00:04.694 [WS-CLEANUP] Deleting project workspace... 00:00:04.694 [WS-CLEANUP] Deferred wipeout is used... 
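The process-cleanup trace above reduces to a small reusable pattern: list anything still running under the workspace's spdk tree, drop the pgrep invocation itself from the listing, and force-kill the survivors. A minimal sketch, assuming the same workspace path; the pid guard stands in for the bare `+ true` that keeps the stage green when nothing matches:

    #!/usr/bin/env bash
    # Kill stale SPDK test processes left over from a previous run.
    workspace=/var/jenkins/workspace/short-fuzz-phy-autotest
    # pgrep -af prints "PID full-command"; drop our own pgrep line, keep the PIDs.
    pids=$(sudo pgrep -af "$workspace/spdk" | grep -v 'sudo pgrep' | awk '{print $1}')
    # Only kill when something matched, so an empty run does not fail the stage.
    [ -n "$pids" ] && sudo kill -9 $pids
    true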
00:00:04.701 [WS-CLEANUP] done 00:00:04.705 [Pipeline] setCustomBuildProperty 00:00:04.720 [Pipeline] sh 00:00:05.001 + sudo git config --global --replace-all safe.directory '*' 00:00:05.097 [Pipeline] httpRequest 00:00:05.598 [Pipeline] echo 00:00:05.600 Sorcerer 10.211.164.20 is alive 00:00:05.610 [Pipeline] retry 00:00:05.612 [Pipeline] { 00:00:05.627 [Pipeline] httpRequest 00:00:05.631 HttpMethod: GET 00:00:05.632 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:05.632 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:05.634 Response Code: HTTP/1.1 200 OK 00:00:05.635 Success: Status code 200 is in the accepted range: 200,404 00:00:05.635 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:06.013 [Pipeline] } 00:00:06.026 [Pipeline] // retry 00:00:06.032 [Pipeline] sh 00:00:06.315 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:06.331 [Pipeline] httpRequest 00:00:07.342 [Pipeline] echo 00:00:07.344 Sorcerer 10.211.164.20 is alive 00:00:07.352 [Pipeline] retry 00:00:07.353 [Pipeline] { 00:00:07.367 [Pipeline] httpRequest 00:00:07.371 HttpMethod: GET 00:00:07.371 URL: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:07.372 Sending request to url: http://10.211.164.20/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:07.374 Response Code: HTTP/1.1 200 OK 00:00:07.374 Success: Status code 200 is in the accepted range: 200,404 00:00:07.375 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:25.923 [Pipeline] } 00:00:25.946 [Pipeline] // retry 00:00:25.955 [Pipeline] sh 00:00:26.240 + tar --no-same-owner -xf spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:28.778 [Pipeline] sh 00:00:29.056 + git -C spdk log --oneline -n5 00:00:29.056 c13c99a5e test: Various fixes for Fedora40 00:00:29.056 726a04d70 test/nvmf: adjust timeout for bigger nvmes 00:00:29.056 61c96acfb dpdk: Point dpdk submodule at a latest fix from spdk-23.11 00:00:29.056 7db6dcdb8 nvme/fio_plugin: update the way ruhs descriptors are fetched 00:00:29.056 ff6f5c41e nvme/fio_plugin: trim add support for multiple ranges 00:00:29.066 [Pipeline] } 00:00:29.082 [Pipeline] // stage 00:00:29.090 [Pipeline] stage 00:00:29.092 [Pipeline] { (Prepare) 00:00:29.108 [Pipeline] writeFile 00:00:29.122 [Pipeline] sh 00:00:29.400 + logger -p user.info -t JENKINS-CI 00:00:29.411 [Pipeline] sh 00:00:29.691 + logger -p user.info -t JENKINS-CI 00:00:29.703 [Pipeline] sh 00:00:29.981 + cat autorun-spdk.conf 00:00:29.981 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:29.981 SPDK_TEST_FUZZER_SHORT=1 00:00:29.981 SPDK_TEST_FUZZER=1 00:00:29.981 SPDK_RUN_UBSAN=1 00:00:29.988 RUN_NIGHTLY=1 00:00:29.994 [Pipeline] readFile 00:00:30.021 [Pipeline] withEnv 00:00:30.023 [Pipeline] { 00:00:30.036 [Pipeline] sh 00:00:30.316 + set -ex 00:00:30.316 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:00:30.316 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:00:30.316 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:30.316 ++ SPDK_TEST_FUZZER_SHORT=1 00:00:30.316 ++ SPDK_TEST_FUZZER=1 00:00:30.316 ++ SPDK_RUN_UBSAN=1 00:00:30.316 ++ RUN_NIGHTLY=1 00:00:30.316 + case $SPDK_TEST_NVMF_NICS in 00:00:30.316 + DRIVERS= 00:00:30.316 + [[ -n '' ]] 00:00:30.316 + exit 0 00:00:30.324 
[Pipeline] } 00:00:30.339 [Pipeline] // withEnv 00:00:30.344 [Pipeline] } 00:00:30.358 [Pipeline] // stage 00:00:30.367 [Pipeline] catchError 00:00:30.369 [Pipeline] { 00:00:30.383 [Pipeline] timeout 00:00:30.384 Timeout set to expire in 30 min 00:00:30.386 [Pipeline] { 00:00:30.402 [Pipeline] stage 00:00:30.405 [Pipeline] { (Tests) 00:00:30.422 [Pipeline] sh 00:00:30.765 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:30.765 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:30.765 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:00:30.765 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:00:30.765 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:30.765 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:00:30.765 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:00:30.765 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:00:30.765 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:00:30.765 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:00:30.765 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:00:30.765 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:30.765 + source /etc/os-release 00:00:30.765 ++ NAME='Fedora Linux' 00:00:30.765 ++ VERSION='39 (Cloud Edition)' 00:00:30.765 ++ ID=fedora 00:00:30.765 ++ VERSION_ID=39 00:00:30.765 ++ VERSION_CODENAME= 00:00:30.765 ++ PLATFORM_ID=platform:f39 00:00:30.765 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)' 00:00:30.765 ++ ANSI_COLOR='0;38;2;60;110;180' 00:00:30.765 ++ LOGO=fedora-logo-icon 00:00:30.765 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39 00:00:30.765 ++ HOME_URL=https://fedoraproject.org/ 00:00:30.765 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/ 00:00:30.765 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:00:30.765 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:00:30.765 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:00:30.765 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39 00:00:30.765 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:00:30.765 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39 00:00:30.765 ++ SUPPORT_END=2024-11-12 00:00:30.765 ++ VARIANT='Cloud Edition' 00:00:30.765 ++ VARIANT_ID=cloud 00:00:30.765 + uname -a 00:00:30.765 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux 00:00:30.765 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:00:34.046 Hugepages 00:00:34.046 node hugesize free / total 00:00:34.046 node0 1048576kB 0 / 0 00:00:34.046 node0 2048kB 0 / 0 00:00:34.047 node1 1048576kB 0 / 0 00:00:34.047 node1 2048kB 0 / 0 00:00:34.047 00:00:34.047 Type BDF Vendor Device NUMA Driver Device Block devices 00:00:34.047 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:00:34.047 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:00:34.047 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:00:34.047 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:00:34.047 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:00:34.047 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:00:34.047 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:00:34.047 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:00:34.047 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:00:34.047 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:00:34.047 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:00:34.047 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:00:34.047 I/OAT 0000:80:04.4 8086 
2021 1 ioatdma - - 00:00:34.047 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:00:34.047 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:00:34.047 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:00:34.047 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:00:34.047 + rm -f /tmp/spdk-ld-path 00:00:34.047 + source autorun-spdk.conf 00:00:34.047 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:34.047 ++ SPDK_TEST_FUZZER_SHORT=1 00:00:34.047 ++ SPDK_TEST_FUZZER=1 00:00:34.047 ++ SPDK_RUN_UBSAN=1 00:00:34.047 ++ RUN_NIGHTLY=1 00:00:34.047 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:00:34.047 + [[ -n '' ]] 00:00:34.047 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:34.047 + for M in /var/spdk/build-*-manifest.txt 00:00:34.047 + [[ -f /var/spdk/build-kernel-manifest.txt ]] 00:00:34.047 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:00:34.047 + for M in /var/spdk/build-*-manifest.txt 00:00:34.047 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:00:34.047 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:00:34.047 + for M in /var/spdk/build-*-manifest.txt 00:00:34.047 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:00:34.047 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:00:34.047 ++ uname 00:00:34.047 + [[ Linux == \L\i\n\u\x ]] 00:00:34.047 + sudo dmesg -T 00:00:34.047 + sudo dmesg --clear 00:00:34.047 + dmesg_pid=3060690 00:00:34.047 + [[ Fedora Linux == FreeBSD ]] 00:00:34.047 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:34.047 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:34.047 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:00:34.047 + [[ -x /usr/src/fio-static/fio ]] 00:00:34.047 + export FIO_BIN=/usr/src/fio-static/fio 00:00:34.047 + FIO_BIN=/usr/src/fio-static/fio 00:00:34.047 + sudo dmesg -Tw 00:00:34.047 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:00:34.047 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:00:34.047 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:00:34.047 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:34.047 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:34.047 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:00:34.047 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:34.047 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:34.047 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:00:34.047 Test configuration: 00:00:34.047 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:34.047 SPDK_TEST_FUZZER_SHORT=1 00:00:34.047 SPDK_TEST_FUZZER=1 00:00:34.047 SPDK_RUN_UBSAN=1 00:00:34.047 RUN_NIGHTLY=1 09:23:56 -- common/autotest_common.sh@1689 -- $ [[ n == y ]] 00:00:34.047 09:23:56 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:00:34.047 09:23:56 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:00:34.047 09:23:56 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:00:34.047 09:23:56 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:00:34.047 09:23:56 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:34.047 09:23:56 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:34.047 09:23:56 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:34.047 09:23:56 -- paths/export.sh@5 -- $ export PATH 00:00:34.047 09:23:56 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:34.047 09:23:56 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:00:34.047 09:23:56 -- common/autobuild_common.sh@440 -- $ date +%s 00:00:34.047 09:23:56 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1732868636.XXXXXX 00:00:34.047 09:23:56 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1732868636.3IVw6c 00:00:34.047 09:23:56 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]] 00:00:34.047 09:23:56 -- common/autobuild_common.sh@446 -- $ '[' -n '' ']' 00:00:34.047 09:23:56 
-- common/autobuild_common.sh@449 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/' 00:00:34.047 09:23:56 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:00:34.047 09:23:56 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:00:34.047 09:23:56 -- common/autobuild_common.sh@456 -- $ get_config_params 00:00:34.047 09:23:56 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:00:34.047 09:23:56 -- common/autotest_common.sh@10 -- $ set +x 00:00:34.047 09:23:56 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user' 00:00:34.047 09:23:56 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:00:34.047 09:23:56 -- spdk/autobuild.sh@12 -- $ umask 022 00:00:34.047 09:23:56 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:34.047 09:23:56 -- spdk/autobuild.sh@16 -- $ date -u 00:00:34.047 Fri Nov 29 08:23:56 AM UTC 2024 00:00:34.047 09:23:56 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:00:34.047 LTS-67-gc13c99a5e 00:00:34.047 09:23:56 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:00:34.047 09:23:56 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:00:34.047 09:23:56 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:00:34.047 09:23:56 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:00:34.047 09:23:56 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:00:34.047 09:23:56 -- common/autotest_common.sh@10 -- $ set +x 00:00:34.047 ************************************ 00:00:34.047 START TEST ubsan 00:00:34.047 ************************************ 00:00:34.047 09:23:56 -- common/autotest_common.sh@1114 -- $ echo 'using ubsan' 00:00:34.047 using ubsan 00:00:34.047 00:00:34.047 real 0m0.000s 00:00:34.047 user 0m0.000s 00:00:34.047 sys 0m0.000s 00:00:34.047 09:23:56 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:00:34.047 09:23:56 -- common/autotest_common.sh@10 -- $ set +x 00:00:34.047 ************************************ 00:00:34.047 END TEST ubsan 00:00:34.047 ************************************ 00:00:34.047 09:23:56 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:00:34.048 09:23:56 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:00:34.048 09:23:56 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:00:34.048 09:23:56 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:00:34.048 09:23:56 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:00:34.048 09:23:56 -- common/autobuild_common.sh@428 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:00:34.048 09:23:56 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']' 00:00:34.048 09:23:56 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:00:34.048 09:23:56 -- common/autotest_common.sh@10 -- $ set +x 00:00:34.048 ************************************ 00:00:34.048 START TEST autobuild_llvm_precompile 00:00:34.048 ************************************ 00:00:34.048 09:23:56 -- common/autotest_common.sh@1114 -- $ _llvm_precompile 00:00:34.048 09:23:56 -- common/autobuild_common.sh@32 -- $ clang 
--version 00:00:34.048 09:23:56 -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:00:34.048 Target: x86_64-redhat-linux-gnu 00:00:34.048 Thread model: posix 00:00:34.048 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:00:34.048 09:23:56 -- common/autobuild_common.sh@33 -- $ clang_num=17 00:00:34.048 09:23:56 -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:00:34.048 09:23:56 -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:00:34.048 09:23:56 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:00:34.048 09:23:56 -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:00:34.048 09:23:56 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:00:34.048 09:23:56 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:00:34.048 09:23:56 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:00:34.048 09:23:56 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:00:34.048 09:23:56 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:00:34.307 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:00:34.307 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:00:34.566 Using 'verbs' RDMA provider 00:00:50.381 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:01:02.582 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:01:02.582 Creating mk/config.mk...done. 00:01:02.582 Creating mk/cc.flags.mk...done. 00:01:02.582 Type 'make' to build. 
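The precompile stage above keys everything off the clang major version: it parses `clang --version`, exports CC/CXX as the versioned binaries, and, when the matching libFuzzer archive exists, appends `--with-fuzzer` to the configure arguments. A condensed sketch of that detection, assuming clang is on PATH and the Fedora clang layout seen in the trace; `extra_args` is an illustrative name, not the script's own variable:

    #!/usr/bin/env bash
    # Derive the clang major version (e.g. 17) from `clang --version`.
    if [[ "$(clang --version)" =~ version\ ([0-9]+)\. ]]; then
        clang_num=${BASH_REMATCH[1]}
        export CC="clang-$clang_num" CXX="clang++-$clang_num"
        # Fedora ships the no-main fuzzer runtime under the versioned clang dir.
        fuzzer_lib="/usr/lib/clang/$clang_num/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a"
        # Pass it to SPDK's configure only if the archive is actually present.
        [[ -e "$fuzzer_lib" ]] && extra_args="--with-fuzzer=$fuzzer_lib"
    fi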
00:01:02.582 00:01:02.582 real 0m28.401s 00:01:02.582 user 0m12.574s 00:01:02.582 sys 0m15.207s 00:01:02.582 09:24:25 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:01:02.582 09:24:25 -- common/autotest_common.sh@10 -- $ set +x 00:01:02.582 ************************************ 00:01:02.582 END TEST autobuild_llvm_precompile 00:01:02.582 ************************************ 00:01:02.582 09:24:25 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:01:02.582 09:24:25 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:01:02.582 09:24:25 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:01:02.582 09:24:25 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:01:02.582 09:24:25 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:01:02.582 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:01:02.582 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:01:03.149 Using 'verbs' RDMA provider 00:01:15.925 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:01:28.130 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:01:28.130 Creating mk/config.mk...done. 00:01:28.130 Creating mk/cc.flags.mk...done. 00:01:28.130 Type 'make' to build. 00:01:28.130 09:24:49 -- spdk/autobuild.sh@69 -- $ run_test make make -j112 00:01:28.130 09:24:49 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:01:28.130 09:24:49 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:01:28.130 09:24:49 -- common/autotest_common.sh@10 -- $ set +x 00:01:28.130 ************************************ 00:01:28.130 START TEST make 00:01:28.130 ************************************ 00:01:28.130 09:24:49 -- common/autotest_common.sh@1114 -- $ make -j112 00:01:28.130 make[1]: Nothing to be done for 'all'. 
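The `run_test` wrapper bracketing these steps (`run_test ubsan ...`, `run_test autobuild_llvm_precompile ...`, and `run_test make make -j112` above) appears to banner the test name, run the command under `time`, and banner the end. The real helper lives in autotest_common.sh; this is only an approximation of its visible behavior in this log:

    # Approximate the START/END banners and timing seen in the output above.
    run_test() {
        local name=$1; shift
        printf '************************************\nSTART TEST %s\n************************************\n' "$name"
        time "$@"
        local rc=$?
        printf '************************************\nEND TEST %s\n************************************\n' "$name"
        return $rc
    }
    run_test make make -j112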
00:01:28.697 The Meson build system 00:01:28.697 Version: 1.5.0 00:01:28.697 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:01:28.697 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:01:28.697 Build type: native build 00:01:28.697 Project name: libvfio-user 00:01:28.697 Project version: 0.0.1 00:01:28.697 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:01:28.697 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:01:28.697 Host machine cpu family: x86_64 00:01:28.697 Host machine cpu: x86_64 00:01:28.697 Run-time dependency threads found: YES 00:01:28.697 Library dl found: YES 00:01:28.697 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:01:28.697 Run-time dependency json-c found: YES 0.17 00:01:28.697 Run-time dependency cmocka found: YES 1.1.7 00:01:28.697 Program pytest-3 found: NO 00:01:28.697 Program flake8 found: NO 00:01:28.697 Program misspell-fixer found: NO 00:01:28.697 Program restructuredtext-lint found: NO 00:01:28.697 Program valgrind found: YES (/usr/bin/valgrind) 00:01:28.697 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:28.697 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:28.697 Compiler for C supports arguments -Wwrite-strings: YES 00:01:28.697 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:01:28.697 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:01:28.697 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:01:28.697 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
00:01:28.697 Build targets in project: 8 00:01:28.697 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:01:28.697 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:01:28.697 00:01:28.697 libvfio-user 0.0.1 00:01:28.697 00:01:28.697 User defined options 00:01:28.697 buildtype : debug 00:01:28.697 default_library: static 00:01:28.697 libdir : /usr/local/lib 00:01:28.697 00:01:28.697 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:29.265 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:01:29.265 [1/36] Compiling C object samples/lspci.p/lspci.c.o 00:01:29.265 [2/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:01:29.265 [3/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:01:29.265 [4/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:01:29.265 [5/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:01:29.265 [6/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:01:29.265 [7/36] Compiling C object samples/null.p/null.c.o 00:01:29.265 [8/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:01:29.265 [9/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:01:29.265 [10/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:01:29.265 [11/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:01:29.265 [12/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:01:29.265 [13/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:01:29.265 [14/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:01:29.265 [15/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:01:29.265 [16/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:01:29.265 [17/36] Compiling C object test/unit_tests.p/mocks.c.o 00:01:29.265 [18/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:01:29.265 [19/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:01:29.265 [20/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:01:29.265 [21/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:01:29.265 [22/36] Compiling C object samples/server.p/server.c.o 00:01:29.265 [23/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:01:29.265 [24/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:01:29.265 [25/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:01:29.265 [26/36] Compiling C object samples/client.p/client.c.o 00:01:29.265 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:01:29.265 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:01:29.265 [29/36] Linking static target lib/libvfio-user.a 00:01:29.265 [30/36] Linking target samples/client 00:01:29.524 [31/36] Linking target test/unit_tests 00:01:29.524 [32/36] Linking target samples/shadow_ioeventfd_server 00:01:29.524 [33/36] Linking target samples/lspci 00:01:29.524 [34/36] Linking target samples/null 00:01:29.524 [35/36] Linking target samples/server 00:01:29.524 [36/36] Linking target samples/gpio-pci-idio-16 00:01:29.524 INFO: autodetecting backend as ninja 00:01:29.524 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:01:29.524 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:01:29.783 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:01:29.783 ninja: no work to do. 00:01:35.063 The Meson build system 00:01:35.063 Version: 1.5.0 00:01:35.063 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk 00:01:35.063 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp 00:01:35.063 Build type: native build 00:01:35.063 Program cat found: YES (/usr/bin/cat) 00:01:35.063 Project name: DPDK 00:01:35.063 Project version: 23.11.0 00:01:35.063 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:01:35.063 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:01:35.063 Host machine cpu family: x86_64 00:01:35.063 Host machine cpu: x86_64 00:01:35.063 Message: ## Building in Developer Mode ## 00:01:35.063 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:35.063 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:01:35.063 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:01:35.063 Program python3 found: YES (/usr/bin/python3) 00:01:35.063 Program cat found: YES (/usr/bin/cat) 00:01:35.063 Compiler for C supports arguments -march=native: YES 00:01:35.063 Checking for size of "void *" : 8 00:01:35.063 Checking for size of "void *" : 8 (cached) 00:01:35.063 Library m found: YES 00:01:35.063 Library numa found: YES 00:01:35.063 Has header "numaif.h" : YES 00:01:35.063 Library fdt found: NO 00:01:35.063 Library execinfo found: NO 00:01:35.063 Has header "execinfo.h" : YES 00:01:35.063 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:01:35.063 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:35.063 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:35.063 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:35.063 Run-time dependency openssl found: YES 3.1.1 00:01:35.063 Run-time dependency libpcap found: YES 1.10.4 00:01:35.063 Has header "pcap.h" with dependency libpcap: YES 00:01:35.063 Compiler for C supports arguments -Wcast-qual: YES 00:01:35.063 Compiler for C supports arguments -Wdeprecated: YES 00:01:35.063 Compiler for C supports arguments -Wformat: YES 00:01:35.063 Compiler for C supports arguments -Wformat-nonliteral: YES 00:01:35.063 Compiler for C supports arguments -Wformat-security: YES 00:01:35.063 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:35.063 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:35.063 Compiler for C supports arguments -Wnested-externs: YES 00:01:35.063 Compiler for C supports arguments -Wold-style-definition: YES 00:01:35.063 Compiler for C supports arguments -Wpointer-arith: YES 00:01:35.063 Compiler for C supports arguments -Wsign-compare: YES 00:01:35.063 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:35.063 Compiler for C supports arguments -Wundef: YES 00:01:35.063 Compiler for C supports arguments -Wwrite-strings: YES 00:01:35.063 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:35.063 Compiler for C supports arguments -Wno-packed-not-aligned: NO 00:01:35.063 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:35.063 Program objdump found: YES (/usr/bin/objdump) 00:01:35.063 
Compiler for C supports arguments -mavx512f: YES 00:01:35.063 Checking if "AVX512 checking" compiles: YES 00:01:35.063 Fetching value of define "__SSE4_2__" : 1 00:01:35.063 Fetching value of define "__AES__" : 1 00:01:35.063 Fetching value of define "__AVX__" : 1 00:01:35.063 Fetching value of define "__AVX2__" : 1 00:01:35.063 Fetching value of define "__AVX512BW__" : 1 00:01:35.063 Fetching value of define "__AVX512CD__" : 1 00:01:35.063 Fetching value of define "__AVX512DQ__" : 1 00:01:35.063 Fetching value of define "__AVX512F__" : 1 00:01:35.063 Fetching value of define "__AVX512VL__" : 1 00:01:35.063 Fetching value of define "__PCLMUL__" : 1 00:01:35.064 Fetching value of define "__RDRND__" : 1 00:01:35.064 Fetching value of define "__RDSEED__" : 1 00:01:35.064 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:35.064 Fetching value of define "__znver1__" : (undefined) 00:01:35.064 Fetching value of define "__znver2__" : (undefined) 00:01:35.064 Fetching value of define "__znver3__" : (undefined) 00:01:35.064 Fetching value of define "__znver4__" : (undefined) 00:01:35.064 Compiler for C supports arguments -Wno-format-truncation: NO 00:01:35.064 Message: lib/log: Defining dependency "log" 00:01:35.064 Message: lib/kvargs: Defining dependency "kvargs" 00:01:35.064 Message: lib/telemetry: Defining dependency "telemetry" 00:01:35.064 Checking for function "getentropy" : NO 00:01:35.064 Message: lib/eal: Defining dependency "eal" 00:01:35.064 Message: lib/ring: Defining dependency "ring" 00:01:35.064 Message: lib/rcu: Defining dependency "rcu" 00:01:35.064 Message: lib/mempool: Defining dependency "mempool" 00:01:35.064 Message: lib/mbuf: Defining dependency "mbuf" 00:01:35.064 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:35.064 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:35.064 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:35.064 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:35.064 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:35.064 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:35.064 Compiler for C supports arguments -mpclmul: YES 00:01:35.064 Compiler for C supports arguments -maes: YES 00:01:35.064 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:35.064 Compiler for C supports arguments -mavx512bw: YES 00:01:35.064 Compiler for C supports arguments -mavx512dq: YES 00:01:35.064 Compiler for C supports arguments -mavx512vl: YES 00:01:35.064 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:35.064 Compiler for C supports arguments -mavx2: YES 00:01:35.064 Compiler for C supports arguments -mavx: YES 00:01:35.064 Message: lib/net: Defining dependency "net" 00:01:35.064 Message: lib/meter: Defining dependency "meter" 00:01:35.064 Message: lib/ethdev: Defining dependency "ethdev" 00:01:35.064 Message: lib/pci: Defining dependency "pci" 00:01:35.064 Message: lib/cmdline: Defining dependency "cmdline" 00:01:35.064 Message: lib/hash: Defining dependency "hash" 00:01:35.064 Message: lib/timer: Defining dependency "timer" 00:01:35.064 Message: lib/compressdev: Defining dependency "compressdev" 00:01:35.064 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:35.064 Message: lib/dmadev: Defining dependency "dmadev" 00:01:35.064 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:35.064 Message: lib/power: Defining dependency "power" 00:01:35.064 Message: lib/reorder: Defining dependency "reorder" 00:01:35.064 Message: lib/security: Defining dependency 
"security" 00:01:35.064 Has header "linux/userfaultfd.h" : YES 00:01:35.064 Has header "linux/vduse.h" : YES 00:01:35.064 Message: lib/vhost: Defining dependency "vhost" 00:01:35.064 Compiler for C supports arguments -Wno-format-truncation: NO (cached) 00:01:35.064 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:35.064 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:35.064 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:35.064 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:01:35.064 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:01:35.064 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:01:35.064 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:01:35.064 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:01:35.064 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:01:35.064 Program doxygen found: YES (/usr/local/bin/doxygen) 00:01:35.064 Configuring doxy-api-html.conf using configuration 00:01:35.064 Configuring doxy-api-man.conf using configuration 00:01:35.064 Program mandb found: YES (/usr/bin/mandb) 00:01:35.064 Program sphinx-build found: NO 00:01:35.064 Configuring rte_build_config.h using configuration 00:01:35.064 Message: 00:01:35.064 ================= 00:01:35.064 Applications Enabled 00:01:35.064 ================= 00:01:35.064 00:01:35.064 apps: 00:01:35.064 00:01:35.064 00:01:35.064 Message: 00:01:35.064 ================= 00:01:35.064 Libraries Enabled 00:01:35.064 ================= 00:01:35.064 00:01:35.064 libs: 00:01:35.064 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:01:35.064 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:01:35.064 cryptodev, dmadev, power, reorder, security, vhost, 00:01:35.064 00:01:35.064 Message: 00:01:35.064 =============== 00:01:35.064 Drivers Enabled 00:01:35.064 =============== 00:01:35.064 00:01:35.064 common: 00:01:35.064 00:01:35.064 bus: 00:01:35.064 pci, vdev, 00:01:35.064 mempool: 00:01:35.064 ring, 00:01:35.064 dma: 00:01:35.064 00:01:35.064 net: 00:01:35.064 00:01:35.064 crypto: 00:01:35.064 00:01:35.064 compress: 00:01:35.064 00:01:35.064 vdpa: 00:01:35.064 00:01:35.064 00:01:35.064 Message: 00:01:35.064 ================= 00:01:35.064 Content Skipped 00:01:35.064 ================= 00:01:35.064 00:01:35.064 apps: 00:01:35.064 dumpcap: explicitly disabled via build config 00:01:35.064 graph: explicitly disabled via build config 00:01:35.064 pdump: explicitly disabled via build config 00:01:35.064 proc-info: explicitly disabled via build config 00:01:35.064 test-acl: explicitly disabled via build config 00:01:35.064 test-bbdev: explicitly disabled via build config 00:01:35.064 test-cmdline: explicitly disabled via build config 00:01:35.064 test-compress-perf: explicitly disabled via build config 00:01:35.064 test-crypto-perf: explicitly disabled via build config 00:01:35.064 test-dma-perf: explicitly disabled via build config 00:01:35.064 test-eventdev: explicitly disabled via build config 00:01:35.064 test-fib: explicitly disabled via build config 00:01:35.064 test-flow-perf: explicitly disabled via build config 00:01:35.064 test-gpudev: explicitly disabled via build config 00:01:35.064 test-mldev: explicitly disabled via build config 00:01:35.064 test-pipeline: explicitly disabled via build config 00:01:35.064 test-pmd: explicitly disabled via build config 00:01:35.064 test-regex: explicitly disabled 
via build config 00:01:35.064 test-sad: explicitly disabled via build config 00:01:35.064 test-security-perf: explicitly disabled via build config 00:01:35.064 00:01:35.064 libs: 00:01:35.064 metrics: explicitly disabled via build config 00:01:35.064 acl: explicitly disabled via build config 00:01:35.064 bbdev: explicitly disabled via build config 00:01:35.064 bitratestats: explicitly disabled via build config 00:01:35.064 bpf: explicitly disabled via build config 00:01:35.064 cfgfile: explicitly disabled via build config 00:01:35.064 distributor: explicitly disabled via build config 00:01:35.064 efd: explicitly disabled via build config 00:01:35.064 eventdev: explicitly disabled via build config 00:01:35.064 dispatcher: explicitly disabled via build config 00:01:35.064 gpudev: explicitly disabled via build config 00:01:35.064 gro: explicitly disabled via build config 00:01:35.064 gso: explicitly disabled via build config 00:01:35.064 ip_frag: explicitly disabled via build config 00:01:35.064 jobstats: explicitly disabled via build config 00:01:35.064 latencystats: explicitly disabled via build config 00:01:35.064 lpm: explicitly disabled via build config 00:01:35.064 member: explicitly disabled via build config 00:01:35.064 pcapng: explicitly disabled via build config 00:01:35.064 rawdev: explicitly disabled via build config 00:01:35.064 regexdev: explicitly disabled via build config 00:01:35.064 mldev: explicitly disabled via build config 00:01:35.064 rib: explicitly disabled via build config 00:01:35.064 sched: explicitly disabled via build config 00:01:35.064 stack: explicitly disabled via build config 00:01:35.064 ipsec: explicitly disabled via build config 00:01:35.064 pdcp: explicitly disabled via build config 00:01:35.064 fib: explicitly disabled via build config 00:01:35.064 port: explicitly disabled via build config 00:01:35.064 pdump: explicitly disabled via build config 00:01:35.064 table: explicitly disabled via build config 00:01:35.064 pipeline: explicitly disabled via build config 00:01:35.064 graph: explicitly disabled via build config 00:01:35.064 node: explicitly disabled via build config 00:01:35.064 00:01:35.064 drivers: 00:01:35.064 common/cpt: not in enabled drivers build config 00:01:35.064 common/dpaax: not in enabled drivers build config 00:01:35.064 common/iavf: not in enabled drivers build config 00:01:35.064 common/idpf: not in enabled drivers build config 00:01:35.064 common/mvep: not in enabled drivers build config 00:01:35.064 common/octeontx: not in enabled drivers build config 00:01:35.064 bus/auxiliary: not in enabled drivers build config 00:01:35.064 bus/cdx: not in enabled drivers build config 00:01:35.064 bus/dpaa: not in enabled drivers build config 00:01:35.064 bus/fslmc: not in enabled drivers build config 00:01:35.064 bus/ifpga: not in enabled drivers build config 00:01:35.064 bus/platform: not in enabled drivers build config 00:01:35.064 bus/vmbus: not in enabled drivers build config 00:01:35.064 common/cnxk: not in enabled drivers build config 00:01:35.064 common/mlx5: not in enabled drivers build config 00:01:35.064 common/nfp: not in enabled drivers build config 00:01:35.065 common/qat: not in enabled drivers build config 00:01:35.065 common/sfc_efx: not in enabled drivers build config 00:01:35.065 mempool/bucket: not in enabled drivers build config 00:01:35.065 mempool/cnxk: not in enabled drivers build config 00:01:35.065 mempool/dpaa: not in enabled drivers build config 00:01:35.065 mempool/dpaa2: not in enabled drivers build config 
00:01:35.065 mempool/octeontx: not in enabled drivers build config 00:01:35.065 mempool/stack: not in enabled drivers build config 00:01:35.065 dma/cnxk: not in enabled drivers build config 00:01:35.065 dma/dpaa: not in enabled drivers build config 00:01:35.065 dma/dpaa2: not in enabled drivers build config 00:01:35.065 dma/hisilicon: not in enabled drivers build config 00:01:35.065 dma/idxd: not in enabled drivers build config 00:01:35.065 dma/ioat: not in enabled drivers build config 00:01:35.065 dma/skeleton: not in enabled drivers build config 00:01:35.065 net/af_packet: not in enabled drivers build config 00:01:35.065 net/af_xdp: not in enabled drivers build config 00:01:35.065 net/ark: not in enabled drivers build config 00:01:35.065 net/atlantic: not in enabled drivers build config 00:01:35.065 net/avp: not in enabled drivers build config 00:01:35.065 net/axgbe: not in enabled drivers build config 00:01:35.065 net/bnx2x: not in enabled drivers build config 00:01:35.065 net/bnxt: not in enabled drivers build config 00:01:35.065 net/bonding: not in enabled drivers build config 00:01:35.065 net/cnxk: not in enabled drivers build config 00:01:35.065 net/cpfl: not in enabled drivers build config 00:01:35.065 net/cxgbe: not in enabled drivers build config 00:01:35.065 net/dpaa: not in enabled drivers build config 00:01:35.065 net/dpaa2: not in enabled drivers build config 00:01:35.065 net/e1000: not in enabled drivers build config 00:01:35.065 net/ena: not in enabled drivers build config 00:01:35.065 net/enetc: not in enabled drivers build config 00:01:35.065 net/enetfec: not in enabled drivers build config 00:01:35.065 net/enic: not in enabled drivers build config 00:01:35.065 net/failsafe: not in enabled drivers build config 00:01:35.065 net/fm10k: not in enabled drivers build config 00:01:35.065 net/gve: not in enabled drivers build config 00:01:35.065 net/hinic: not in enabled drivers build config 00:01:35.065 net/hns3: not in enabled drivers build config 00:01:35.065 net/i40e: not in enabled drivers build config 00:01:35.065 net/iavf: not in enabled drivers build config 00:01:35.065 net/ice: not in enabled drivers build config 00:01:35.065 net/idpf: not in enabled drivers build config 00:01:35.065 net/igc: not in enabled drivers build config 00:01:35.065 net/ionic: not in enabled drivers build config 00:01:35.065 net/ipn3ke: not in enabled drivers build config 00:01:35.065 net/ixgbe: not in enabled drivers build config 00:01:35.065 net/mana: not in enabled drivers build config 00:01:35.065 net/memif: not in enabled drivers build config 00:01:35.065 net/mlx4: not in enabled drivers build config 00:01:35.065 net/mlx5: not in enabled drivers build config 00:01:35.065 net/mvneta: not in enabled drivers build config 00:01:35.065 net/mvpp2: not in enabled drivers build config 00:01:35.065 net/netvsc: not in enabled drivers build config 00:01:35.065 net/nfb: not in enabled drivers build config 00:01:35.065 net/nfp: not in enabled drivers build config 00:01:35.065 net/ngbe: not in enabled drivers build config 00:01:35.065 net/null: not in enabled drivers build config 00:01:35.065 net/octeontx: not in enabled drivers build config 00:01:35.065 net/octeon_ep: not in enabled drivers build config 00:01:35.065 net/pcap: not in enabled drivers build config 00:01:35.065 net/pfe: not in enabled drivers build config 00:01:35.065 net/qede: not in enabled drivers build config 00:01:35.065 net/ring: not in enabled drivers build config 00:01:35.065 net/sfc: not in enabled drivers build config 00:01:35.065 
net/softnic: not in enabled drivers build config 00:01:35.065 net/tap: not in enabled drivers build config 00:01:35.065 net/thunderx: not in enabled drivers build config 00:01:35.065 net/txgbe: not in enabled drivers build config 00:01:35.065 net/vdev_netvsc: not in enabled drivers build config 00:01:35.065 net/vhost: not in enabled drivers build config 00:01:35.065 net/virtio: not in enabled drivers build config 00:01:35.065 net/vmxnet3: not in enabled drivers build config 00:01:35.065 raw/*: missing internal dependency, "rawdev" 00:01:35.065 crypto/armv8: not in enabled drivers build config 00:01:35.065 crypto/bcmfs: not in enabled drivers build config 00:01:35.065 crypto/caam_jr: not in enabled drivers build config 00:01:35.065 crypto/ccp: not in enabled drivers build config 00:01:35.065 crypto/cnxk: not in enabled drivers build config 00:01:35.065 crypto/dpaa_sec: not in enabled drivers build config 00:01:35.065 crypto/dpaa2_sec: not in enabled drivers build config 00:01:35.065 crypto/ipsec_mb: not in enabled drivers build config 00:01:35.065 crypto/mlx5: not in enabled drivers build config 00:01:35.065 crypto/mvsam: not in enabled drivers build config 00:01:35.065 crypto/nitrox: not in enabled drivers build config 00:01:35.065 crypto/null: not in enabled drivers build config 00:01:35.065 crypto/octeontx: not in enabled drivers build config 00:01:35.065 crypto/openssl: not in enabled drivers build config 00:01:35.065 crypto/scheduler: not in enabled drivers build config 00:01:35.065 crypto/uadk: not in enabled drivers build config 00:01:35.065 crypto/virtio: not in enabled drivers build config 00:01:35.065 compress/isal: not in enabled drivers build config 00:01:35.065 compress/mlx5: not in enabled drivers build config 00:01:35.065 compress/octeontx: not in enabled drivers build config 00:01:35.065 compress/zlib: not in enabled drivers build config 00:01:35.065 regex/*: missing internal dependency, "regexdev" 00:01:35.065 ml/*: missing internal dependency, "mldev" 00:01:35.065 vdpa/ifc: not in enabled drivers build config 00:01:35.065 vdpa/mlx5: not in enabled drivers build config 00:01:35.065 vdpa/nfp: not in enabled drivers build config 00:01:35.065 vdpa/sfc: not in enabled drivers build config 00:01:35.065 event/*: missing internal dependency, "eventdev" 00:01:35.065 baseband/*: missing internal dependency, "bbdev" 00:01:35.065 gpu/*: missing internal dependency, "gpudev" 00:01:35.065 00:01:35.065 00:01:35.065 Build targets in project: 85 00:01:35.065 00:01:35.065 DPDK 23.11.0 00:01:35.065 00:01:35.065 User defined options 00:01:35.065 buildtype : debug 00:01:35.065 default_library : static 00:01:35.065 libdir : lib 00:01:35.065 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:01:35.065 c_args : -fPIC -Werror 00:01:35.065 c_link_args : 00:01:35.065 cpu_instruction_set: native 00:01:35.065 disable_apps : test-sad,test-acl,test-dma-perf,test-pipeline,test-compress-perf,test-fib,test-flow-perf,test-crypto-perf,test-bbdev,test-eventdev,pdump,test-mldev,test-cmdline,graph,test-security-perf,test-pmd,test,proc-info,test-regex,dumpcap,test-gpudev 00:01:35.065 disable_libs : port,sched,rib,node,ipsec,distributor,gro,eventdev,pdcp,acl,member,latencystats,efd,stack,regexdev,rawdev,bpf,metrics,gpudev,pipeline,pdump,table,fib,dispatcher,mldev,gso,cfgfile,bitratestats,ip_frag,graph,lpm,jobstats,pcapng,bbdev 00:01:35.065 enable_docs : false 00:01:35.065 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring 00:01:35.065 enable_kmods : false 00:01:35.065 tests : false 
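For reference, the DPDK configuration summarized above maps onto a meson setup call along these lines. This is reconstructed from the "User defined options" block (SPDK normally drives it through its own configure and Makefile), and the long disable_apps/disable_libs lists are elided rather than repeated:

    meson setup build-tmp \
        --buildtype=debug \
        --default-library=static \
        --libdir=lib \
        --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build \
        -Dc_args='-fPIC -Werror' \
        -Dcpu_instruction_set=native \
        -Denable_docs=false \
        -Denable_kmods=false \
        -Dtests=false \
        -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring \
        -Ddisable_apps=... \
        -Ddisable_libs=...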
00:01:35.065 00:01:35.065 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:35.332 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp' 00:01:35.332 [1/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:35.332 [2/265] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:35.332 [3/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:35.332 [4/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:35.332 [5/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:35.332 [6/265] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:35.332 [7/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:35.332 [8/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:35.332 [9/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:35.332 [10/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:35.332 [11/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:35.332 [12/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:35.332 [13/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:35.332 [14/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:35.332 [15/265] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:35.332 [16/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:35.332 [17/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:35.332 [18/265] Linking static target lib/librte_kvargs.a 00:01:35.332 [19/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:35.332 [20/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:35.332 [21/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:35.332 [22/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:35.332 [23/265] Linking static target lib/librte_log.a 00:01:35.332 [24/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:35.332 [25/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:35.332 [26/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:35.332 [27/265] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:35.332 [28/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:35.332 [29/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:35.332 [30/265] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:35.332 [31/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:35.332 [32/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:35.332 [33/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:35.332 [34/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:35.332 [35/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:35.332 [36/265] Linking static target lib/librte_pci.a 00:01:35.590 [37/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:35.590 [38/265] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:35.590 [39/265] Compiling C object 
lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:35.590 [40/265] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:35.590 [41/265] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:35.590 [42/265] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.850 [43/265] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:35.850 [44/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:35.850 [45/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:35.850 [46/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:35.850 [47/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:35.850 [48/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:35.850 [49/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:35.850 [50/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:35.850 [51/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:35.850 [52/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:35.850 [53/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:35.850 [54/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:35.850 [55/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:35.850 [56/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:35.850 [57/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:35.850 [58/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:35.850 [59/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:35.850 [60/265] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:35.850 [61/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:35.850 [62/265] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:35.850 [63/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:35.850 [64/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:35.850 [65/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:35.850 [66/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:35.850 [67/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:35.850 [68/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:35.850 [69/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:35.850 [70/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:35.850 [71/265] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:35.850 [72/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:35.850 [73/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:35.850 [74/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:35.850 [75/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:35.850 [76/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:35.850 [77/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:35.850 [78/265] Compiling C object 
lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:35.850 [79/265] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:35.850 [80/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:35.850 [81/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:35.850 [82/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:35.850 [83/265] Linking static target lib/librte_telemetry.a 00:01:35.851 [84/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:35.851 [85/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:35.851 [86/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:35.851 [87/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:35.851 [88/265] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:35.851 [89/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:35.851 [90/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:35.851 [91/265] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:35.851 [92/265] Linking static target lib/librte_meter.a 00:01:35.851 [93/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:35.851 [94/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:35.851 [95/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:35.851 [96/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:35.851 [97/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:35.851 [98/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:35.851 [99/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:35.851 [100/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:35.851 [101/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:35.851 [102/265] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:35.851 [103/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:35.851 [104/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:35.851 [105/265] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:35.851 [106/265] Linking static target lib/librte_ring.a 00:01:35.851 [107/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:36.110 [108/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:36.110 [109/265] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:36.110 [110/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:36.110 [111/265] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:36.110 [112/265] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:36.110 [113/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:36.110 [114/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:36.110 [115/265] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:36.110 [116/265] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.110 [117/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:36.110 [118/265] Compiling C object 
lib/librte_power.a.p/power_rte_power.c.o 00:01:36.110 [119/265] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:36.110 [120/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:36.110 [121/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:36.110 [122/265] Linking static target lib/librte_timer.a 00:01:36.110 [123/265] Linking static target lib/librte_cmdline.a 00:01:36.110 [124/265] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:36.110 [125/265] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:36.110 [126/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:36.110 [127/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:36.110 [128/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:36.110 [129/265] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:36.110 [130/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:36.110 [131/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:36.110 [132/265] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:36.110 [133/265] Linking static target lib/librte_dmadev.a 00:01:36.110 [134/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:36.110 [135/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:36.110 [136/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:36.110 [137/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:36.110 [138/265] Linking static target lib/librte_mempool.a 00:01:36.110 [139/265] Linking static target lib/librte_eal.a 00:01:36.110 [140/265] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:36.110 [141/265] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:36.110 [142/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:36.110 [143/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:36.110 [144/265] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:36.110 [145/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:36.110 [146/265] Linking target lib/librte_log.so.24.0 00:01:36.110 [147/265] Linking static target lib/librte_rcu.a 00:01:36.110 [148/265] Linking static target lib/librte_net.a 00:01:36.110 [149/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:36.110 [150/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:36.110 [151/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:36.110 [152/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:36.110 [153/265] Linking static target lib/librte_mbuf.a 00:01:36.110 [154/265] Linking static target lib/librte_compressdev.a 00:01:36.110 [155/265] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:36.110 [156/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:36.110 [157/265] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:36.110 [158/265] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:36.110 [159/265] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:36.110 [160/265] Linking static target lib/librte_reorder.a 00:01:36.110 [161/265] Linking static 
target lib/librte_power.a 00:01:36.110 [162/265] Linking static target lib/librte_hash.a 00:01:36.110 [163/265] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:36.110 [164/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:36.110 [165/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:36.110 [166/265] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:36.110 [167/265] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:36.110 [168/265] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:01:36.110 [169/265] Linking static target lib/librte_security.a 00:01:36.110 [170/265] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.369 [171/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:36.369 [172/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:36.369 [173/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:36.369 [174/265] Linking target lib/librte_kvargs.so.24.0 00:01:36.369 [175/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:36.369 [176/265] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:36.369 [177/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:36.369 [178/265] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.369 [179/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:36.369 [180/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:36.369 [181/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:36.369 [182/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:36.369 [183/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:36.369 [184/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:36.369 [185/265] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:36.369 [186/265] Linking static target lib/librte_cryptodev.a 00:01:36.369 [187/265] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:36.369 [188/265] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.369 [189/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:36.369 [190/265] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:36.369 [191/265] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:01:36.369 [192/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:36.369 [193/265] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:36.369 [194/265] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.369 [195/265] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:36.369 [196/265] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.628 [197/265] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:36.628 [198/265] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:36.628 [199/265] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.628 [200/265] Linking static target 
drivers/librte_bus_vdev.a 00:01:36.628 [201/265] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.628 [202/265] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:36.628 [203/265] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.628 [204/265] Linking target lib/librte_telemetry.so.24.0 00:01:36.628 [205/265] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:36.628 [206/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:36.628 [207/265] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:36.628 [208/265] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:36.628 [209/265] Linking static target drivers/librte_bus_pci.a 00:01:36.628 [210/265] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:36.628 [211/265] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:36.628 [212/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:36.628 [213/265] Linking static target drivers/librte_mempool_ring.a 00:01:36.628 [214/265] Linking static target lib/librte_ethdev.a 00:01:36.628 [215/265] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:01:36.887 [216/265] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.887 [217/265] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.887 [218/265] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:36.887 [219/265] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.145 [220/265] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.145 [221/265] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.145 [222/265] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.403 [223/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:37.403 [224/265] Linking static target lib/librte_vhost.a 00:01:37.403 [225/265] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:37.403 [226/265] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:38.779 [227/265] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:39.714 [228/265] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.273 [229/265] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.177 [230/265] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.435 [231/265] Linking target lib/librte_eal.so.24.0 00:01:48.435 [232/265] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:01:48.435 [233/265] Linking target lib/librte_dmadev.so.24.0 00:01:48.435 [234/265] Linking target lib/librte_pci.so.24.0 00:01:48.435 [235/265] Linking target lib/librte_ring.so.24.0 00:01:48.435 [236/265] Linking target lib/librte_meter.so.24.0 00:01:48.435 [237/265] Linking target lib/librte_timer.so.24.0 00:01:48.435 
[238/265] Linking target drivers/librte_bus_vdev.so.24.0 00:01:48.695 [239/265] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:01:48.695 [240/265] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:01:48.695 [241/265] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:01:48.695 [242/265] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:01:48.695 [243/265] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:01:48.695 [244/265] Linking target drivers/librte_bus_pci.so.24.0 00:01:48.695 [245/265] Linking target lib/librte_mempool.so.24.0 00:01:48.695 [246/265] Linking target lib/librte_rcu.so.24.0 00:01:48.955 [247/265] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:01:48.955 [248/265] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:01:48.955 [249/265] Linking target lib/librte_mbuf.so.24.0 00:01:48.955 [250/265] Linking target drivers/librte_mempool_ring.so.24.0 00:01:48.955 [251/265] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:01:49.213 [252/265] Linking target lib/librte_cryptodev.so.24.0 00:01:49.213 [253/265] Linking target lib/librte_compressdev.so.24.0 00:01:49.213 [254/265] Linking target lib/librte_reorder.so.24.0 00:01:49.213 [255/265] Linking target lib/librte_net.so.24.0 00:01:49.213 [256/265] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:01:49.213 [257/265] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:01:49.213 [258/265] Linking target lib/librte_hash.so.24.0 00:01:49.471 [259/265] Linking target lib/librte_ethdev.so.24.0 00:01:49.471 [260/265] Linking target lib/librte_security.so.24.0 00:01:49.471 [261/265] Linking target lib/librte_cmdline.so.24.0 00:01:49.471 [262/265] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:01:49.471 [263/265] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:01:49.471 [264/265] Linking target lib/librte_power.so.24.0 00:01:49.471 [265/265] Linking target lib/librte_vhost.so.24.0 00:01:49.471 INFO: autodetecting backend as ninja 00:01:49.471 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp -j 112 00:01:50.406 CC lib/ut/ut.o 00:01:50.406 CC lib/log/log.o 00:01:50.406 CC lib/log/log_flags.o 00:01:50.406 CC lib/log/log_deprecated.o 00:01:50.406 CC lib/ut_mock/mock.o 00:01:50.665 LIB libspdk_ut.a 00:01:50.665 LIB libspdk_ut_mock.a 00:01:50.665 LIB libspdk_log.a 00:01:50.924 CC lib/dma/dma.o 00:01:50.924 CC lib/ioat/ioat.o 00:01:50.924 CXX lib/trace_parser/trace.o 00:01:50.924 CC lib/util/base64.o 00:01:50.924 CC lib/util/bit_array.o 00:01:50.924 CC lib/util/cpuset.o 00:01:50.924 CC lib/util/crc16.o 00:01:50.924 CC lib/util/crc32.o 00:01:50.924 CC lib/util/crc32c.o 00:01:50.924 CC lib/util/dif.o 00:01:50.924 CC lib/util/crc32_ieee.o 00:01:50.924 CC lib/util/crc64.o 00:01:50.924 CC lib/util/fd.o 00:01:50.924 CC lib/util/file.o 00:01:50.924 CC lib/util/hexlify.o 00:01:50.924 CC lib/util/iov.o 00:01:50.924 CC lib/util/math.o 00:01:50.924 CC lib/util/pipe.o 00:01:50.924 CC lib/util/strerror_tls.o 00:01:50.924 CC lib/util/string.o 00:01:50.924 CC lib/util/uuid.o 00:01:50.924 CC lib/util/fd_group.o 00:01:50.924 CC lib/util/xor.o 00:01:50.924 CC lib/util/zipf.o 00:01:50.924 LIB 
libspdk_dma.a 00:01:50.924 CC lib/vfio_user/host/vfio_user_pci.o 00:01:50.924 CC lib/vfio_user/host/vfio_user.o 00:01:51.183 LIB libspdk_ioat.a 00:01:51.183 LIB libspdk_vfio_user.a 00:01:51.183 LIB libspdk_util.a 00:01:51.443 LIB libspdk_trace_parser.a 00:01:51.443 CC lib/vmd/vmd.o 00:01:51.443 CC lib/vmd/led.o 00:01:51.443 CC lib/env_dpdk/env.o 00:01:51.443 CC lib/env_dpdk/memory.o 00:01:51.443 CC lib/env_dpdk/pci.o 00:01:51.443 CC lib/env_dpdk/threads.o 00:01:51.443 CC lib/env_dpdk/pci_ioat.o 00:01:51.443 CC lib/env_dpdk/init.o 00:01:51.443 CC lib/env_dpdk/pci_virtio.o 00:01:51.443 CC lib/env_dpdk/pci_event.o 00:01:51.443 CC lib/env_dpdk/pci_vmd.o 00:01:51.443 CC lib/env_dpdk/pci_idxd.o 00:01:51.443 CC lib/env_dpdk/sigbus_handler.o 00:01:51.443 CC lib/env_dpdk/pci_dpdk.o 00:01:51.443 CC lib/env_dpdk/pci_dpdk_2207.o 00:01:51.443 CC lib/idxd/idxd.o 00:01:51.443 CC lib/idxd/idxd_user.o 00:01:51.443 CC lib/env_dpdk/pci_dpdk_2211.o 00:01:51.443 CC lib/idxd/idxd_kernel.o 00:01:51.701 CC lib/conf/conf.o 00:01:51.701 CC lib/rdma/common.o 00:01:51.701 CC lib/rdma/rdma_verbs.o 00:01:51.701 CC lib/json/json_parse.o 00:01:51.701 CC lib/json/json_util.o 00:01:51.701 CC lib/json/json_write.o 00:01:51.701 LIB libspdk_conf.a 00:01:51.701 LIB libspdk_rdma.a 00:01:51.701 LIB libspdk_json.a 00:01:51.960 LIB libspdk_idxd.a 00:01:51.960 LIB libspdk_vmd.a 00:01:51.960 CC lib/jsonrpc/jsonrpc_server.o 00:01:51.960 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:01:51.960 CC lib/jsonrpc/jsonrpc_client.o 00:01:51.960 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:01:52.219 LIB libspdk_jsonrpc.a 00:01:52.485 LIB libspdk_env_dpdk.a 00:01:52.485 CC lib/rpc/rpc.o 00:01:52.745 LIB libspdk_rpc.a 00:01:53.003 CC lib/sock/sock.o 00:01:53.003 CC lib/sock/sock_rpc.o 00:01:53.003 CC lib/notify/notify_rpc.o 00:01:53.003 CC lib/notify/notify.o 00:01:53.003 CC lib/trace/trace.o 00:01:53.003 CC lib/trace/trace_flags.o 00:01:53.003 CC lib/trace/trace_rpc.o 00:01:53.003 LIB libspdk_notify.a 00:01:53.003 LIB libspdk_trace.a 00:01:53.262 LIB libspdk_sock.a 00:01:53.528 CC lib/thread/thread.o 00:01:53.528 CC lib/thread/iobuf.o 00:01:53.528 CC lib/nvme/nvme_ctrlr_cmd.o 00:01:53.528 CC lib/nvme/nvme_ctrlr.o 00:01:53.528 CC lib/nvme/nvme_fabric.o 00:01:53.528 CC lib/nvme/nvme_ns.o 00:01:53.528 CC lib/nvme/nvme_ns_cmd.o 00:01:53.528 CC lib/nvme/nvme_pcie.o 00:01:53.528 CC lib/nvme/nvme_pcie_common.o 00:01:53.528 CC lib/nvme/nvme.o 00:01:53.528 CC lib/nvme/nvme_qpair.o 00:01:53.528 CC lib/nvme/nvme_quirks.o 00:01:53.528 CC lib/nvme/nvme_transport.o 00:01:53.528 CC lib/nvme/nvme_discovery.o 00:01:53.528 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:01:53.528 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:01:53.528 CC lib/nvme/nvme_tcp.o 00:01:53.528 CC lib/nvme/nvme_opal.o 00:01:53.528 CC lib/nvme/nvme_io_msg.o 00:01:53.528 CC lib/nvme/nvme_poll_group.o 00:01:53.528 CC lib/nvme/nvme_zns.o 00:01:53.528 CC lib/nvme/nvme_cuse.o 00:01:53.528 CC lib/nvme/nvme_vfio_user.o 00:01:53.528 CC lib/nvme/nvme_rdma.o 00:01:54.097 LIB libspdk_thread.a 00:01:54.665 CC lib/vfu_tgt/tgt_endpoint.o 00:01:54.665 CC lib/vfu_tgt/tgt_rpc.o 00:01:54.665 CC lib/blob/request.o 00:01:54.665 CC lib/blob/blobstore.o 00:01:54.665 CC lib/blob/zeroes.o 00:01:54.665 CC lib/init/subsystem.o 00:01:54.665 CC lib/init/json_config.o 00:01:54.665 CC lib/blob/blob_bs_dev.o 00:01:54.665 CC lib/init/subsystem_rpc.o 00:01:54.665 CC lib/virtio/virtio.o 00:01:54.665 CC lib/init/rpc.o 00:01:54.665 CC lib/virtio/virtio_vhost_user.o 00:01:54.665 CC lib/virtio/virtio_vfio_user.o 00:01:54.665 CC lib/virtio/virtio_pci.o 
00:01:54.665 CC lib/accel/accel.o 00:01:54.665 CC lib/accel/accel_rpc.o 00:01:54.665 CC lib/accel/accel_sw.o 00:01:54.665 LIB libspdk_nvme.a 00:01:54.665 LIB libspdk_init.a 00:01:54.665 LIB libspdk_vfu_tgt.a 00:01:54.665 LIB libspdk_virtio.a 00:01:54.924 CC lib/event/app.o 00:01:54.924 CC lib/event/reactor.o 00:01:54.924 CC lib/event/log_rpc.o 00:01:54.924 CC lib/event/app_rpc.o 00:01:54.924 CC lib/event/scheduler_static.o 00:01:55.184 LIB libspdk_accel.a 00:01:55.184 LIB libspdk_event.a 00:01:55.443 CC lib/bdev/bdev.o 00:01:55.443 CC lib/bdev/part.o 00:01:55.443 CC lib/bdev/bdev_rpc.o 00:01:55.443 CC lib/bdev/bdev_zone.o 00:01:55.443 CC lib/bdev/scsi_nvme.o 00:01:56.011 LIB libspdk_blob.a 00:01:56.269 CC lib/lvol/lvol.o 00:01:56.269 CC lib/blobfs/blobfs.o 00:01:56.269 CC lib/blobfs/tree.o 00:01:56.837 LIB libspdk_lvol.a 00:01:56.837 LIB libspdk_blobfs.a 00:01:57.095 LIB libspdk_bdev.a 00:01:57.353 CC lib/nbd/nbd.o 00:01:57.353 CC lib/nbd/nbd_rpc.o 00:01:57.353 CC lib/ublk/ublk.o 00:01:57.353 CC lib/scsi/dev.o 00:01:57.353 CC lib/scsi/lun.o 00:01:57.353 CC lib/scsi/scsi_bdev.o 00:01:57.353 CC lib/scsi/port.o 00:01:57.353 CC lib/ublk/ublk_rpc.o 00:01:57.353 CC lib/scsi/scsi.o 00:01:57.353 CC lib/scsi/scsi_pr.o 00:01:57.353 CC lib/scsi/scsi_rpc.o 00:01:57.353 CC lib/scsi/task.o 00:01:57.353 CC lib/ftl/ftl_core.o 00:01:57.353 CC lib/ftl/ftl_debug.o 00:01:57.353 CC lib/ftl/ftl_init.o 00:01:57.353 CC lib/ftl/ftl_layout.o 00:01:57.353 CC lib/ftl/ftl_io.o 00:01:57.353 CC lib/ftl/ftl_l2p_flat.o 00:01:57.353 CC lib/ftl/ftl_l2p.o 00:01:57.353 CC lib/ftl/ftl_sb.o 00:01:57.353 CC lib/ftl/ftl_band_ops.o 00:01:57.353 CC lib/ftl/ftl_nv_cache.o 00:01:57.353 CC lib/nvmf/ctrlr.o 00:01:57.353 CC lib/ftl/ftl_band.o 00:01:57.353 CC lib/nvmf/ctrlr_discovery.o 00:01:57.353 CC lib/nvmf/ctrlr_bdev.o 00:01:57.353 CC lib/ftl/ftl_writer.o 00:01:57.353 CC lib/nvmf/subsystem.o 00:01:57.353 CC lib/ftl/ftl_rq.o 00:01:57.353 CC lib/nvmf/nvmf.o 00:01:57.353 CC lib/ftl/ftl_reloc.o 00:01:57.353 CC lib/nvmf/nvmf_rpc.o 00:01:57.353 CC lib/nvmf/transport.o 00:01:57.353 CC lib/ftl/ftl_l2p_cache.o 00:01:57.353 CC lib/nvmf/tcp.o 00:01:57.353 CC lib/ftl/ftl_p2l.o 00:01:57.353 CC lib/nvmf/vfio_user.o 00:01:57.353 CC lib/ftl/mngt/ftl_mngt.o 00:01:57.353 CC lib/nvmf/rdma.o 00:01:57.353 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:01:57.353 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:01:57.353 CC lib/ftl/mngt/ftl_mngt_startup.o 00:01:57.353 CC lib/ftl/mngt/ftl_mngt_md.o 00:01:57.353 CC lib/ftl/mngt/ftl_mngt_misc.o 00:01:57.353 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:01:57.353 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:01:57.353 CC lib/ftl/mngt/ftl_mngt_band.o 00:01:57.353 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:01:57.353 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:01:57.353 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:01:57.353 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:01:57.353 CC lib/ftl/utils/ftl_conf.o 00:01:57.353 CC lib/ftl/utils/ftl_md.o 00:01:57.353 CC lib/ftl/utils/ftl_mempool.o 00:01:57.353 CC lib/ftl/utils/ftl_bitmap.o 00:01:57.353 CC lib/ftl/utils/ftl_property.o 00:01:57.353 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:01:57.353 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:01:57.353 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:01:57.353 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:01:57.353 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:01:57.353 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:01:57.353 CC lib/ftl/upgrade/ftl_sb_v3.o 00:01:57.353 CC lib/ftl/upgrade/ftl_sb_v5.o 00:01:57.353 CC lib/ftl/nvc/ftl_nvc_dev.o 00:01:57.353 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:01:57.353 CC 
lib/ftl/base/ftl_base_dev.o 00:01:57.353 CC lib/ftl/base/ftl_base_bdev.o 00:01:57.353 CC lib/ftl/ftl_trace.o 00:01:57.611 LIB libspdk_nbd.a 00:01:57.869 LIB libspdk_scsi.a 00:01:57.869 LIB libspdk_ublk.a 00:01:58.127 CC lib/vhost/vhost.o 00:01:58.127 CC lib/vhost/vhost_rpc.o 00:01:58.127 CC lib/vhost/vhost_scsi.o 00:01:58.127 CC lib/vhost/vhost_blk.o 00:01:58.127 CC lib/vhost/rte_vhost_user.o 00:01:58.127 CC lib/iscsi/conn.o 00:01:58.127 CC lib/iscsi/init_grp.o 00:01:58.127 CC lib/iscsi/iscsi.o 00:01:58.127 CC lib/iscsi/portal_grp.o 00:01:58.127 CC lib/iscsi/md5.o 00:01:58.127 CC lib/iscsi/param.o 00:01:58.127 CC lib/iscsi/iscsi_rpc.o 00:01:58.127 CC lib/iscsi/tgt_node.o 00:01:58.127 CC lib/iscsi/iscsi_subsystem.o 00:01:58.127 CC lib/iscsi/task.o 00:01:58.127 LIB libspdk_ftl.a 00:01:58.696 LIB libspdk_nvmf.a 00:01:58.696 LIB libspdk_vhost.a 00:01:58.696 LIB libspdk_iscsi.a 00:01:59.263 CC module/env_dpdk/env_dpdk_rpc.o 00:01:59.263 CC module/vfu_device/vfu_virtio.o 00:01:59.263 CC module/vfu_device/vfu_virtio_scsi.o 00:01:59.263 CC module/vfu_device/vfu_virtio_blk.o 00:01:59.263 CC module/vfu_device/vfu_virtio_rpc.o 00:01:59.263 LIB libspdk_env_dpdk_rpc.a 00:01:59.263 CC module/accel/ioat/accel_ioat_rpc.o 00:01:59.263 CC module/accel/ioat/accel_ioat.o 00:01:59.263 CC module/sock/posix/posix.o 00:01:59.263 CC module/scheduler/dynamic/scheduler_dynamic.o 00:01:59.263 CC module/blob/bdev/blob_bdev.o 00:01:59.263 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:01:59.263 CC module/scheduler/gscheduler/gscheduler.o 00:01:59.263 CC module/accel/dsa/accel_dsa_rpc.o 00:01:59.263 CC module/accel/dsa/accel_dsa.o 00:01:59.263 CC module/accel/iaa/accel_iaa.o 00:01:59.263 CC module/accel/iaa/accel_iaa_rpc.o 00:01:59.263 CC module/accel/error/accel_error.o 00:01:59.263 CC module/accel/error/accel_error_rpc.o 00:01:59.521 LIB libspdk_scheduler_dpdk_governor.a 00:01:59.521 LIB libspdk_scheduler_gscheduler.a 00:01:59.521 LIB libspdk_accel_ioat.a 00:01:59.521 LIB libspdk_scheduler_dynamic.a 00:01:59.521 LIB libspdk_accel_error.a 00:01:59.521 LIB libspdk_accel_iaa.a 00:01:59.521 LIB libspdk_blob_bdev.a 00:01:59.521 LIB libspdk_accel_dsa.a 00:01:59.521 LIB libspdk_vfu_device.a 00:01:59.779 LIB libspdk_sock_posix.a 00:02:00.037 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:00.037 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:00.037 CC module/bdev/malloc/bdev_malloc.o 00:02:00.037 CC module/bdev/raid/bdev_raid.o 00:02:00.037 CC module/bdev/passthru/vbdev_passthru.o 00:02:00.037 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:00.037 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:00.037 CC module/bdev/raid/bdev_raid_rpc.o 00:02:00.037 CC module/bdev/raid/raid0.o 00:02:00.037 CC module/bdev/raid/raid1.o 00:02:00.037 CC module/bdev/raid/bdev_raid_sb.o 00:02:00.037 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:00.037 CC module/bdev/raid/concat.o 00:02:00.037 CC module/bdev/error/vbdev_error.o 00:02:00.037 CC module/bdev/gpt/gpt.o 00:02:00.037 CC module/bdev/split/vbdev_split.o 00:02:00.037 CC module/bdev/error/vbdev_error_rpc.o 00:02:00.037 CC module/bdev/split/vbdev_split_rpc.o 00:02:00.037 CC module/bdev/null/bdev_null_rpc.o 00:02:00.037 CC module/bdev/gpt/vbdev_gpt.o 00:02:00.037 CC module/bdev/null/bdev_null.o 00:02:00.037 CC module/blobfs/bdev/blobfs_bdev.o 00:02:00.037 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:00.037 CC module/bdev/aio/bdev_aio.o 00:02:00.037 CC module/bdev/aio/bdev_aio_rpc.o 00:02:00.037 CC module/bdev/nvme/bdev_nvme.o 00:02:00.037 CC module/bdev/nvme/nvme_rpc.o 00:02:00.037 CC 
module/bdev/nvme/bdev_nvme_rpc.o 00:02:00.037 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:00.037 CC module/bdev/nvme/bdev_mdns_client.o 00:02:00.037 CC module/bdev/nvme/vbdev_opal.o 00:02:00.037 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:00.037 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:00.037 CC module/bdev/iscsi/bdev_iscsi.o 00:02:00.037 CC module/bdev/ftl/bdev_ftl.o 00:02:00.037 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:00.037 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:00.037 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:00.037 CC module/bdev/lvol/vbdev_lvol.o 00:02:00.037 CC module/bdev/delay/vbdev_delay.o 00:02:00.037 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:00.037 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:00.037 LIB libspdk_blobfs_bdev.a 00:02:00.037 LIB libspdk_bdev_split.a 00:02:00.037 LIB libspdk_bdev_error.a 00:02:00.037 LIB libspdk_bdev_null.a 00:02:00.037 LIB libspdk_bdev_gpt.a 00:02:00.037 LIB libspdk_bdev_passthru.a 00:02:00.296 LIB libspdk_bdev_ftl.a 00:02:00.296 LIB libspdk_bdev_aio.a 00:02:00.296 LIB libspdk_bdev_malloc.a 00:02:00.296 LIB libspdk_bdev_zone_block.a 00:02:00.296 LIB libspdk_bdev_iscsi.a 00:02:00.296 LIB libspdk_bdev_delay.a 00:02:00.296 LIB libspdk_bdev_virtio.a 00:02:00.296 LIB libspdk_bdev_lvol.a 00:02:00.555 LIB libspdk_bdev_raid.a 00:02:01.123 LIB libspdk_bdev_nvme.a 00:02:01.692 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:01.692 CC module/event/subsystems/sock/sock.o 00:02:01.692 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:02:01.692 CC module/event/subsystems/iobuf/iobuf.o 00:02:01.692 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:01.692 CC module/event/subsystems/scheduler/scheduler.o 00:02:01.692 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:01.692 CC module/event/subsystems/vmd/vmd.o 00:02:01.692 LIB libspdk_event_sock.a 00:02:01.951 LIB libspdk_event_vhost_blk.a 00:02:01.951 LIB libspdk_event_vfu_tgt.a 00:02:01.951 LIB libspdk_event_vmd.a 00:02:01.951 LIB libspdk_event_scheduler.a 00:02:01.951 LIB libspdk_event_iobuf.a 00:02:02.211 CC module/event/subsystems/accel/accel.o 00:02:02.211 LIB libspdk_event_accel.a 00:02:02.781 CC module/event/subsystems/bdev/bdev.o 00:02:02.781 LIB libspdk_event_bdev.a 00:02:03.040 CC module/event/subsystems/nbd/nbd.o 00:02:03.040 CC module/event/subsystems/scsi/scsi.o 00:02:03.040 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:03.040 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:03.040 CC module/event/subsystems/ublk/ublk.o 00:02:03.040 LIB libspdk_event_nbd.a 00:02:03.040 LIB libspdk_event_scsi.a 00:02:03.040 LIB libspdk_event_ublk.a 00:02:03.299 LIB libspdk_event_nvmf.a 00:02:03.299 CC module/event/subsystems/iscsi/iscsi.o 00:02:03.558 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:03.558 LIB libspdk_event_iscsi.a 00:02:03.558 LIB libspdk_event_vhost_scsi.a 00:02:03.818 CXX app/trace/trace.o 00:02:03.818 CC test/rpc_client/rpc_client_test.o 00:02:03.818 CC app/spdk_nvme_perf/perf.o 00:02:03.818 CC app/spdk_lspci/spdk_lspci.o 00:02:03.818 CC app/trace_record/trace_record.o 00:02:03.818 CC app/spdk_nvme_identify/identify.o 00:02:03.818 CC app/spdk_top/spdk_top.o 00:02:03.818 CC app/spdk_nvme_discover/discovery_aer.o 00:02:03.818 TEST_HEADER include/spdk/accel_module.h 00:02:03.818 TEST_HEADER include/spdk/accel.h 00:02:03.818 TEST_HEADER include/spdk/assert.h 00:02:03.818 TEST_HEADER include/spdk/barrier.h 00:02:03.818 TEST_HEADER include/spdk/bdev.h 00:02:03.818 TEST_HEADER include/spdk/base64.h 00:02:03.818 TEST_HEADER include/spdk/bdev_zone.h 00:02:03.818 
TEST_HEADER include/spdk/bdev_module.h 00:02:03.818 TEST_HEADER include/spdk/bit_array.h 00:02:03.818 TEST_HEADER include/spdk/bit_pool.h 00:02:03.818 TEST_HEADER include/spdk/blob_bdev.h 00:02:03.818 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:03.818 TEST_HEADER include/spdk/blob.h 00:02:03.818 TEST_HEADER include/spdk/blobfs.h 00:02:03.818 TEST_HEADER include/spdk/conf.h 00:02:03.818 TEST_HEADER include/spdk/config.h 00:02:03.818 TEST_HEADER include/spdk/crc16.h 00:02:03.818 TEST_HEADER include/spdk/cpuset.h 00:02:03.818 TEST_HEADER include/spdk/crc32.h 00:02:03.818 TEST_HEADER include/spdk/crc64.h 00:02:03.818 TEST_HEADER include/spdk/dif.h 00:02:03.818 TEST_HEADER include/spdk/endian.h 00:02:03.818 TEST_HEADER include/spdk/dma.h 00:02:03.818 TEST_HEADER include/spdk/env_dpdk.h 00:02:03.818 TEST_HEADER include/spdk/env.h 00:02:03.818 TEST_HEADER include/spdk/event.h 00:02:03.818 TEST_HEADER include/spdk/fd_group.h 00:02:03.818 TEST_HEADER include/spdk/fd.h 00:02:03.818 TEST_HEADER include/spdk/file.h 00:02:03.818 TEST_HEADER include/spdk/ftl.h 00:02:03.818 TEST_HEADER include/spdk/gpt_spec.h 00:02:03.818 TEST_HEADER include/spdk/histogram_data.h 00:02:03.818 TEST_HEADER include/spdk/hexlify.h 00:02:03.818 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:03.818 TEST_HEADER include/spdk/idxd.h 00:02:03.818 TEST_HEADER include/spdk/idxd_spec.h 00:02:03.818 TEST_HEADER include/spdk/ioat.h 00:02:03.818 TEST_HEADER include/spdk/init.h 00:02:03.818 TEST_HEADER include/spdk/ioat_spec.h 00:02:03.818 TEST_HEADER include/spdk/iscsi_spec.h 00:02:03.818 TEST_HEADER include/spdk/jsonrpc.h 00:02:03.818 TEST_HEADER include/spdk/json.h 00:02:03.818 TEST_HEADER include/spdk/likely.h 00:02:03.818 TEST_HEADER include/spdk/memory.h 00:02:03.818 TEST_HEADER include/spdk/log.h 00:02:03.818 TEST_HEADER include/spdk/lvol.h 00:02:03.818 TEST_HEADER include/spdk/mmio.h 00:02:03.818 TEST_HEADER include/spdk/nbd.h 00:02:03.818 CC app/spdk_dd/spdk_dd.o 00:02:03.818 TEST_HEADER include/spdk/notify.h 00:02:03.818 TEST_HEADER include/spdk/nvme.h 00:02:03.818 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:03.818 TEST_HEADER include/spdk/nvme_intel.h 00:02:03.818 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:03.818 TEST_HEADER include/spdk/nvme_spec.h 00:02:03.818 TEST_HEADER include/spdk/nvme_zns.h 00:02:03.818 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:03.818 CC app/nvmf_tgt/nvmf_main.o 00:02:03.818 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:03.818 TEST_HEADER include/spdk/nvmf.h 00:02:03.818 TEST_HEADER include/spdk/nvmf_spec.h 00:02:03.818 TEST_HEADER include/spdk/nvmf_transport.h 00:02:03.818 TEST_HEADER include/spdk/opal.h 00:02:03.818 TEST_HEADER include/spdk/opal_spec.h 00:02:03.818 TEST_HEADER include/spdk/pci_ids.h 00:02:03.818 TEST_HEADER include/spdk/pipe.h 00:02:03.818 TEST_HEADER include/spdk/reduce.h 00:02:03.818 TEST_HEADER include/spdk/queue.h 00:02:03.818 TEST_HEADER include/spdk/rpc.h 00:02:03.818 TEST_HEADER include/spdk/scheduler.h 00:02:03.818 CC app/vhost/vhost.o 00:02:03.818 TEST_HEADER include/spdk/scsi.h 00:02:03.818 CC app/spdk_tgt/spdk_tgt.o 00:02:03.818 TEST_HEADER include/spdk/scsi_spec.h 00:02:03.818 TEST_HEADER include/spdk/sock.h 00:02:03.818 TEST_HEADER include/spdk/stdinc.h 00:02:03.818 TEST_HEADER include/spdk/string.h 00:02:03.818 CC app/iscsi_tgt/iscsi_tgt.o 00:02:03.818 TEST_HEADER include/spdk/thread.h 00:02:03.818 TEST_HEADER include/spdk/trace.h 00:02:03.818 TEST_HEADER include/spdk/tree.h 00:02:03.818 TEST_HEADER include/spdk/trace_parser.h 00:02:03.818 TEST_HEADER 
include/spdk/ublk.h 00:02:03.818 TEST_HEADER include/spdk/util.h 00:02:03.818 TEST_HEADER include/spdk/version.h 00:02:03.818 TEST_HEADER include/spdk/uuid.h 00:02:03.818 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:03.818 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:03.818 TEST_HEADER include/spdk/vhost.h 00:02:03.818 TEST_HEADER include/spdk/vmd.h 00:02:03.818 TEST_HEADER include/spdk/xor.h 00:02:03.818 TEST_HEADER include/spdk/zipf.h 00:02:03.818 CXX test/cpp_headers/accel_module.o 00:02:03.818 CXX test/cpp_headers/accel.o 00:02:03.818 CXX test/cpp_headers/assert.o 00:02:03.818 CXX test/cpp_headers/barrier.o 00:02:03.818 CXX test/cpp_headers/base64.o 00:02:03.818 CXX test/cpp_headers/bdev.o 00:02:04.081 CXX test/cpp_headers/bdev_module.o 00:02:04.081 CXX test/cpp_headers/bdev_zone.o 00:02:04.081 CXX test/cpp_headers/blob_bdev.o 00:02:04.081 CXX test/cpp_headers/bit_array.o 00:02:04.081 CXX test/cpp_headers/bit_pool.o 00:02:04.081 CC test/app/stub/stub.o 00:02:04.081 CC test/app/histogram_perf/histogram_perf.o 00:02:04.081 CXX test/cpp_headers/blobfs_bdev.o 00:02:04.081 CXX test/cpp_headers/blob.o 00:02:04.081 CXX test/cpp_headers/conf.o 00:02:04.081 CXX test/cpp_headers/blobfs.o 00:02:04.081 CXX test/cpp_headers/config.o 00:02:04.081 CXX test/cpp_headers/crc16.o 00:02:04.081 CXX test/cpp_headers/cpuset.o 00:02:04.081 CXX test/cpp_headers/crc32.o 00:02:04.081 CXX test/cpp_headers/dif.o 00:02:04.081 CXX test/cpp_headers/crc64.o 00:02:04.081 CXX test/cpp_headers/dma.o 00:02:04.081 CXX test/cpp_headers/endian.o 00:02:04.081 CXX test/cpp_headers/env_dpdk.o 00:02:04.081 CXX test/cpp_headers/fd_group.o 00:02:04.081 CXX test/cpp_headers/env.o 00:02:04.081 CXX test/cpp_headers/event.o 00:02:04.081 CXX test/cpp_headers/file.o 00:02:04.081 CXX test/cpp_headers/fd.o 00:02:04.081 CXX test/cpp_headers/gpt_spec.o 00:02:04.081 CXX test/cpp_headers/ftl.o 00:02:04.081 CXX test/cpp_headers/idxd.o 00:02:04.081 CXX test/cpp_headers/hexlify.o 00:02:04.081 CXX test/cpp_headers/histogram_data.o 00:02:04.081 CXX test/cpp_headers/idxd_spec.o 00:02:04.081 CC test/nvme/overhead/overhead.o 00:02:04.081 CC test/nvme/aer/aer.o 00:02:04.081 CC test/app/jsoncat/jsoncat.o 00:02:04.081 CC test/nvme/boot_partition/boot_partition.o 00:02:04.081 CC test/nvme/reserve/reserve.o 00:02:04.081 CC test/nvme/err_injection/err_injection.o 00:02:04.081 CC test/nvme/simple_copy/simple_copy.o 00:02:04.081 CC test/nvme/fused_ordering/fused_ordering.o 00:02:04.081 CC test/nvme/e2edp/nvme_dp.o 00:02:04.081 CC test/nvme/sgl/sgl.o 00:02:04.081 CC test/env/pci/pci_ut.o 00:02:04.081 CC test/thread/lock/spdk_lock.o 00:02:04.081 CC app/fio/nvme/fio_plugin.o 00:02:04.081 CC test/nvme/connect_stress/connect_stress.o 00:02:04.081 CC examples/nvme/arbitration/arbitration.o 00:02:04.081 CC test/nvme/startup/startup.o 00:02:04.081 CC examples/nvme/hello_world/hello_world.o 00:02:04.081 CC test/dma/test_dma/test_dma.o 00:02:04.081 CC test/event/app_repeat/app_repeat.o 00:02:04.081 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:04.081 CC test/nvme/fdp/fdp.o 00:02:04.081 CC test/event/reactor/reactor.o 00:02:04.081 CC test/nvme/cuse/cuse.o 00:02:04.081 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:04.081 CC test/thread/poller_perf/poller_perf.o 00:02:04.081 CC test/event/reactor_perf/reactor_perf.o 00:02:04.081 CC test/env/memory/memory_ut.o 00:02:04.081 CC test/event/event_perf/event_perf.o 00:02:04.081 CC examples/nvme/hotplug/hotplug.o 00:02:04.081 CC examples/accel/perf/accel_perf.o 00:02:04.081 CC test/nvme/reset/reset.o 00:02:04.081 CC 
examples/nvme/pmr_persistence/pmr_persistence.o 00:02:04.081 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:04.081 CC examples/sock/hello_world/hello_sock.o 00:02:04.081 CC examples/ioat/perf/perf.o 00:02:04.081 CC test/env/vtophys/vtophys.o 00:02:04.082 CC test/accel/dif/dif.o 00:02:04.082 CC examples/nvme/reconnect/reconnect.o 00:02:04.082 CC examples/nvme/abort/abort.o 00:02:04.082 CC examples/util/zipf/zipf.o 00:02:04.082 CXX test/cpp_headers/init.o 00:02:04.082 CC test/nvme/compliance/nvme_compliance.o 00:02:04.082 CC test/bdev/bdevio/bdevio.o 00:02:04.082 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:04.082 LINK spdk_lspci 00:02:04.082 CC test/app/bdev_svc/bdev_svc.o 00:02:04.082 CC examples/ioat/verify/verify.o 00:02:04.082 CC examples/vmd/lsvmd/lsvmd.o 00:02:04.082 CC examples/idxd/perf/perf.o 00:02:04.082 CC examples/vmd/led/led.o 00:02:04.082 CC test/blobfs/mkfs/mkfs.o 00:02:04.082 CC examples/blob/hello_world/hello_blob.o 00:02:04.082 CC test/event/scheduler/scheduler.o 00:02:04.082 CC app/fio/bdev/fio_plugin.o 00:02:04.082 CC examples/blob/cli/blobcli.o 00:02:04.082 CC examples/nvmf/nvmf/nvmf.o 00:02:04.082 LINK rpc_client_test 00:02:04.082 CC examples/bdev/bdevperf/bdevperf.o 00:02:04.082 CC examples/bdev/hello_world/hello_bdev.o 00:02:04.082 CC examples/thread/thread/thread_ex.o 00:02:04.082 CC test/env/mem_callbacks/mem_callbacks.o 00:02:04.082 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:04.082 CC test/lvol/esnap/esnap.o 00:02:04.082 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:04.082 LINK spdk_nvme_discover 00:02:04.082 LINK histogram_perf 00:02:04.082 LINK interrupt_tgt 00:02:04.082 LINK nvmf_tgt 00:02:04.082 LINK spdk_trace_record 00:02:04.082 CXX test/cpp_headers/ioat.o 00:02:04.082 LINK jsoncat 00:02:04.082 CXX test/cpp_headers/ioat_spec.o 00:02:04.082 CXX test/cpp_headers/iscsi_spec.o 00:02:04.082 LINK vhost 00:02:04.082 CXX test/cpp_headers/json.o 00:02:04.082 LINK stub 00:02:04.082 CXX test/cpp_headers/jsonrpc.o 00:02:04.082 CXX test/cpp_headers/likely.o 00:02:04.082 CXX test/cpp_headers/log.o 00:02:04.082 CXX test/cpp_headers/lvol.o 00:02:04.082 CXX test/cpp_headers/memory.o 00:02:04.082 CXX test/cpp_headers/mmio.o 00:02:04.082 CXX test/cpp_headers/nbd.o 00:02:04.082 LINK poller_perf 00:02:04.082 LINK reactor 00:02:04.082 LINK reactor_perf 00:02:04.082 CXX test/cpp_headers/notify.o 00:02:04.082 CXX test/cpp_headers/nvme.o 00:02:04.082 CXX test/cpp_headers/nvme_intel.o 00:02:04.082 CXX test/cpp_headers/nvme_ocssd.o 00:02:04.082 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:04.082 CXX test/cpp_headers/nvme_spec.o 00:02:04.082 CXX test/cpp_headers/nvme_zns.o 00:02:04.082 CXX test/cpp_headers/nvmf_cmd.o 00:02:04.082 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:04.082 CXX test/cpp_headers/nvmf.o 00:02:04.082 LINK app_repeat 00:02:04.082 CXX test/cpp_headers/nvmf_spec.o 00:02:04.346 LINK lsvmd 00:02:04.346 CXX test/cpp_headers/nvmf_transport.o 00:02:04.346 LINK event_perf 00:02:04.346 LINK vtophys 00:02:04.346 CXX test/cpp_headers/opal.o 00:02:04.346 CXX test/cpp_headers/opal_spec.o 00:02:04.346 CXX test/cpp_headers/pci_ids.o 00:02:04.346 CXX test/cpp_headers/pipe.o 00:02:04.346 CXX test/cpp_headers/queue.o 00:02:04.346 LINK boot_partition 00:02:04.346 LINK spdk_tgt 00:02:04.346 LINK zipf 00:02:04.346 CXX test/cpp_headers/reduce.o 00:02:04.346 LINK env_dpdk_post_init 00:02:04.346 LINK led 00:02:04.346 LINK startup 00:02:04.346 LINK reserve 00:02:04.346 CXX test/cpp_headers/rpc.o 00:02:04.346 LINK connect_stress 00:02:04.346 CXX test/cpp_headers/scheduler.o 
00:02:04.346 LINK iscsi_tgt 00:02:04.346 LINK err_injection 00:02:04.346 LINK fused_ordering 00:02:04.346 CXX test/cpp_headers/scsi.o 00:02:04.346 CXX test/cpp_headers/scsi_spec.o 00:02:04.346 LINK doorbell_aers 00:02:04.346 LINK pmr_persistence 00:02:04.346 LINK bdev_svc 00:02:04.346 CXX test/cpp_headers/sock.o 00:02:04.346 LINK simple_copy 00:02:04.346 LINK verify 00:02:04.346 LINK cmb_copy 00:02:04.346 LINK hello_world 00:02:04.346 LINK hotplug 00:02:04.346 LINK ioat_perf 00:02:04.346 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:04.346 LINK hello_sock 00:02:04.346 LINK mkfs 00:02:04.346 LINK aer 00:02:04.346 LINK sgl 00:02:04.346 LINK nvme_dp 00:02:04.346 LINK fdp 00:02:04.346 LINK reset 00:02:04.346 LINK overhead 00:02:04.346 LINK spdk_trace 00:02:04.346 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:02:04.346 LINK scheduler 00:02:04.346 LINK hello_blob 00:02:04.346 LINK hello_bdev 00:02:04.346 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:02:04.346 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:04.346 CXX test/cpp_headers/stdinc.o 00:02:04.346 CXX test/cpp_headers/string.o 00:02:04.346 CXX test/cpp_headers/thread.o 00:02:04.346 CXX test/cpp_headers/trace.o 00:02:04.346 CXX test/cpp_headers/trace_parser.o 00:02:04.346 CXX test/cpp_headers/tree.o 00:02:04.346 LINK thread 00:02:04.346 CXX test/cpp_headers/ublk.o 00:02:04.346 CXX test/cpp_headers/util.o 00:02:04.346 CXX test/cpp_headers/uuid.o 00:02:04.346 CXX test/cpp_headers/version.o 00:02:04.346 CXX test/cpp_headers/vfio_user_pci.o 00:02:04.346 CXX test/cpp_headers/vfio_user_spec.o 00:02:04.346 CXX test/cpp_headers/vhost.o 00:02:04.346 CXX test/cpp_headers/vmd.o 00:02:04.346 CXX test/cpp_headers/xor.o 00:02:04.346 CXX test/cpp_headers/zipf.o 00:02:04.346 LINK nvmf 00:02:04.346 LINK test_dma 00:02:04.606 LINK idxd_perf 00:02:04.606 LINK reconnect 00:02:04.606 LINK arbitration 00:02:04.606 LINK abort 00:02:04.606 LINK bdevio 00:02:04.606 LINK dif 00:02:04.606 LINK spdk_dd 00:02:04.606 LINK nvme_compliance 00:02:04.606 LINK pci_ut 00:02:04.606 LINK nvme_manage 00:02:04.606 LINK accel_perf 00:02:04.606 LINK nvme_fuzz 00:02:04.606 LINK blobcli 00:02:04.900 LINK mem_callbacks 00:02:04.900 LINK spdk_nvme 00:02:04.900 LINK spdk_nvme_identify 00:02:04.900 LINK llvm_vfio_fuzz 00:02:04.900 LINK vhost_fuzz 00:02:04.900 LINK spdk_bdev 00:02:04.900 LINK spdk_nvme_perf 00:02:05.184 LINK bdevperf 00:02:05.184 LINK spdk_top 00:02:05.184 LINK memory_ut 00:02:05.184 LINK llvm_nvme_fuzz 00:02:05.184 LINK cuse 00:02:05.470 LINK spdk_lock 00:02:05.470 LINK iscsi_fuzz 00:02:07.430 LINK esnap 00:02:07.689 00:02:07.689 real 0m40.894s 00:02:07.689 user 5m43.801s 00:02:07.689 sys 2m52.792s 00:02:07.689 09:25:30 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:02:07.689 09:25:30 -- common/autotest_common.sh@10 -- $ set +x 00:02:07.690 ************************************ 00:02:07.690 END TEST make 00:02:07.690 ************************************ 00:02:07.949 09:25:30 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:02:07.949 09:25:30 -- common/autotest_common.sh@1690 -- # lcov --version 00:02:07.949 09:25:30 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:02:07.949 09:25:30 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:02:07.949 09:25:30 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:02:07.949 09:25:30 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:02:07.950 09:25:30 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:02:07.950 09:25:30 -- scripts/common.sh@335 -- # IFS=.-: 00:02:07.950 09:25:30 -- 
scripts/common.sh@335 -- # read -ra ver1 00:02:07.950 09:25:30 -- scripts/common.sh@336 -- # IFS=.-: 00:02:07.950 09:25:30 -- scripts/common.sh@336 -- # read -ra ver2 00:02:07.950 09:25:30 -- scripts/common.sh@337 -- # local 'op=<' 00:02:07.950 09:25:30 -- scripts/common.sh@339 -- # ver1_l=2 00:02:07.950 09:25:30 -- scripts/common.sh@340 -- # ver2_l=1 00:02:07.950 09:25:30 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:02:07.950 09:25:30 -- scripts/common.sh@343 -- # case "$op" in 00:02:07.950 09:25:30 -- scripts/common.sh@344 -- # : 1 00:02:07.950 09:25:30 -- scripts/common.sh@363 -- # (( v = 0 )) 00:02:07.950 09:25:30 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:07.950 09:25:30 -- scripts/common.sh@364 -- # decimal 1 00:02:07.950 09:25:30 -- scripts/common.sh@352 -- # local d=1 00:02:07.950 09:25:30 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:02:07.950 09:25:30 -- scripts/common.sh@354 -- # echo 1 00:02:07.950 09:25:30 -- scripts/common.sh@364 -- # ver1[v]=1 00:02:07.950 09:25:30 -- scripts/common.sh@365 -- # decimal 2 00:02:07.950 09:25:30 -- scripts/common.sh@352 -- # local d=2 00:02:07.950 09:25:30 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:02:07.950 09:25:30 -- scripts/common.sh@354 -- # echo 2 00:02:07.950 09:25:30 -- scripts/common.sh@365 -- # ver2[v]=2 00:02:07.950 09:25:30 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:02:07.950 09:25:30 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:02:07.950 09:25:30 -- scripts/common.sh@367 -- # return 0 00:02:07.950 09:25:30 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:02:07.950 09:25:30 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:02:07.950 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:07.950 --rc genhtml_branch_coverage=1 00:02:07.950 --rc genhtml_function_coverage=1 00:02:07.950 --rc genhtml_legend=1 00:02:07.950 --rc geninfo_all_blocks=1 00:02:07.950 --rc geninfo_unexecuted_blocks=1 00:02:07.950 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:07.950 ' 00:02:07.950 09:25:30 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:02:07.950 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:07.950 --rc genhtml_branch_coverage=1 00:02:07.950 --rc genhtml_function_coverage=1 00:02:07.950 --rc genhtml_legend=1 00:02:07.950 --rc geninfo_all_blocks=1 00:02:07.950 --rc geninfo_unexecuted_blocks=1 00:02:07.950 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:07.950 ' 00:02:07.950 09:25:30 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:02:07.950 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:07.950 --rc genhtml_branch_coverage=1 00:02:07.950 --rc genhtml_function_coverage=1 00:02:07.950 --rc genhtml_legend=1 00:02:07.950 --rc geninfo_all_blocks=1 00:02:07.950 --rc geninfo_unexecuted_blocks=1 00:02:07.950 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:07.950 ' 00:02:07.950 09:25:30 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:02:07.950 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:07.950 --rc genhtml_branch_coverage=1 00:02:07.950 --rc genhtml_function_coverage=1 00:02:07.950 --rc genhtml_legend=1 00:02:07.950 --rc geninfo_all_blocks=1 00:02:07.950 --rc geninfo_unexecuted_blocks=1 00:02:07.950 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:07.950 ' 00:02:07.950 09:25:30 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:02:07.950 09:25:30 -- nvmf/common.sh@7 -- # uname -s 00:02:07.950 09:25:30 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:07.950 09:25:30 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:07.950 09:25:30 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:07.950 09:25:30 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:07.950 09:25:30 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:07.950 09:25:30 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:07.950 09:25:30 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:07.950 09:25:30 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:07.950 09:25:30 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:07.950 09:25:30 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:07.950 09:25:30 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:02:07.950 09:25:30 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:02:07.950 09:25:30 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:07.950 09:25:30 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:07.950 09:25:30 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:07.950 09:25:30 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:07.950 09:25:30 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:07.950 09:25:30 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:07.950 09:25:30 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:07.950 09:25:30 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:07.950 09:25:30 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:07.950 09:25:30 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:07.950 09:25:30 -- paths/export.sh@5 -- # export PATH 00:02:07.950 09:25:30 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:07.950 09:25:30 -- nvmf/common.sh@46 -- # : 0 00:02:07.950 09:25:30 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:02:07.950 09:25:30 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:02:07.950 09:25:30 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:02:07.950 09:25:30 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:07.950 
09:25:30 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:07.950 09:25:30 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:02:07.950 09:25:30 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:02:07.950 09:25:30 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:02:07.950 09:25:30 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:07.950 09:25:30 -- spdk/autotest.sh@32 -- # uname -s 00:02:07.950 09:25:30 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:07.950 09:25:30 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:07.950 09:25:30 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:07.950 09:25:30 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:07.950 09:25:30 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:07.950 09:25:30 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:07.950 09:25:30 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:07.950 09:25:30 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:07.950 09:25:30 -- spdk/autotest.sh@48 -- # udevadm_pid=3104648 00:02:07.950 09:25:30 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:07.950 09:25:30 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:07.950 09:25:30 -- spdk/autotest.sh@54 -- # echo 3104650 00:02:07.950 09:25:30 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:07.950 09:25:30 -- spdk/autotest.sh@56 -- # echo 3104651 00:02:07.950 09:25:30 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:07.950 09:25:30 -- spdk/autotest.sh@58 -- # [[ ............................... 
!= QEMU ]] 00:02:07.950 09:25:30 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:07.950 09:25:30 -- spdk/autotest.sh@60 -- # echo 3104652 00:02:07.950 09:25:30 -- spdk/autotest.sh@62 -- # echo 3104653 00:02:07.950 09:25:30 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:07.950 09:25:30 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:07.950 09:25:30 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:02:07.950 09:25:30 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:07.950 09:25:30 -- common/autotest_common.sh@10 -- # set +x 00:02:07.950 09:25:30 -- spdk/autotest.sh@70 -- # create_test_list 00:02:07.950 09:25:30 -- common/autotest_common.sh@746 -- # xtrace_disable 00:02:07.950 09:25:30 -- common/autotest_common.sh@10 -- # set +x 00:02:07.950 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:02:08.210 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:02:08.210 09:25:30 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:02:08.210 09:25:30 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:08.210 09:25:30 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:08.210 09:25:30 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:08.210 09:25:30 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:08.210 09:25:30 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:02:08.210 09:25:30 -- common/autotest_common.sh@1450 -- # uname 00:02:08.210 09:25:30 -- common/autotest_common.sh@1450 -- # '[' Linux = FreeBSD ']' 00:02:08.210 09:25:30 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:02:08.210 09:25:30 -- common/autotest_common.sh@1470 -- # uname 00:02:08.210 09:25:30 -- common/autotest_common.sh@1470 -- # [[ Linux = FreeBSD ]] 00:02:08.210 09:25:30 -- spdk/autotest.sh@79 -- # [[ y == y ]] 00:02:08.210 09:25:30 -- spdk/autotest.sh@81 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:02:08.210 lcov: LCOV version 1.15 00:02:08.210 09:25:30 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:02:10.118 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno 00:02:10.118 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno 00:02:10.118 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno 00:02:22.339 09:25:45 -- spdk/autotest.sh@87 -- # timing_enter pre_cleanup 00:02:22.339 09:25:45 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:22.339 09:25:45 -- common/autotest_common.sh@10 -- # set +x 00:02:22.339 09:25:45 -- spdk/autotest.sh@89 -- # rm -f 00:02:22.339 09:25:45 -- spdk/autotest.sh@92 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:25.632 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:02:25.632 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:02:25.632 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:02:25.893 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:02:25.893 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:02:25.893 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:02:25.893 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:02:25.893 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:02:25.893 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:02:25.893 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:02:25.893 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:02:25.893 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:02:25.893 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:02:26.153 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:02:26.153 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:02:26.153 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:02:26.153 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:02:26.153 09:25:48 -- spdk/autotest.sh@94 -- # get_zoned_devs 00:02:26.153 09:25:48 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:02:26.153 09:25:48 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:02:26.153 09:25:48 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:02:26.153 09:25:48 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:02:26.153 09:25:48 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:02:26.153 09:25:48 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:02:26.153 09:25:48 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:26.153 09:25:48 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:02:26.153 09:25:48 -- spdk/autotest.sh@96 -- # (( 0 > 0 )) 00:02:26.153 09:25:48 -- spdk/autotest.sh@108 -- # ls /dev/nvme0n1 00:02:26.153 09:25:48 -- spdk/autotest.sh@108 -- # grep -v p 00:02:26.153 09:25:48 -- spdk/autotest.sh@108 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:02:26.153 09:25:48 -- spdk/autotest.sh@110 -- # [[ -z '' ]] 00:02:26.153 09:25:48 -- spdk/autotest.sh@111 -- # block_in_use /dev/nvme0n1 00:02:26.153 09:25:48 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:02:26.153 09:25:48 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:02:26.153 No valid GPT data, bailing 00:02:26.153 09:25:48 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:02:26.153 09:25:48 -- scripts/common.sh@393 -- # pt= 00:02:26.153 09:25:48 -- scripts/common.sh@394 -- # return 1 00:02:26.153 09:25:48 -- spdk/autotest.sh@112 -- # dd 
if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:02:26.153 1+0 records in 00:02:26.153 1+0 records out 00:02:26.153 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00493679 s, 212 MB/s 00:02:26.153 09:25:48 -- spdk/autotest.sh@116 -- # sync 00:02:26.153 09:25:48 -- spdk/autotest.sh@118 -- # xtrace_disable_per_cmd reap_spdk_processes 00:02:26.153 09:25:48 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:02:26.153 09:25:48 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:02:34.278 09:25:56 -- spdk/autotest.sh@122 -- # uname -s 00:02:34.278 09:25:56 -- spdk/autotest.sh@122 -- # '[' Linux = Linux ']' 00:02:34.278 09:25:56 -- spdk/autotest.sh@123 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:02:34.278 09:25:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:34.278 09:25:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:34.278 09:25:56 -- common/autotest_common.sh@10 -- # set +x 00:02:34.278 ************************************ 00:02:34.278 START TEST setup.sh 00:02:34.278 ************************************ 00:02:34.278 09:25:56 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:02:34.278 * Looking for test storage... 00:02:34.278 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:02:34.278 09:25:56 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:02:34.278 09:25:56 -- common/autotest_common.sh@1690 -- # lcov --version 00:02:34.278 09:25:56 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:02:34.278 09:25:56 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:02:34.278 09:25:56 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:02:34.278 09:25:56 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:02:34.278 09:25:56 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:02:34.278 09:25:56 -- scripts/common.sh@335 -- # IFS=.-: 00:02:34.278 09:25:56 -- scripts/common.sh@335 -- # read -ra ver1 00:02:34.278 09:25:56 -- scripts/common.sh@336 -- # IFS=.-: 00:02:34.278 09:25:56 -- scripts/common.sh@336 -- # read -ra ver2 00:02:34.278 09:25:56 -- scripts/common.sh@337 -- # local 'op=<' 00:02:34.278 09:25:56 -- scripts/common.sh@339 -- # ver1_l=2 00:02:34.278 09:25:56 -- scripts/common.sh@340 -- # ver2_l=1 00:02:34.278 09:25:56 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:02:34.278 09:25:56 -- scripts/common.sh@343 -- # case "$op" in 00:02:34.278 09:25:56 -- scripts/common.sh@344 -- # : 1 00:02:34.278 09:25:56 -- scripts/common.sh@363 -- # (( v = 0 )) 00:02:34.278 09:25:56 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:34.278 09:25:56 -- scripts/common.sh@364 -- # decimal 1 00:02:34.278 09:25:56 -- scripts/common.sh@352 -- # local d=1 00:02:34.278 09:25:56 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:02:34.278 09:25:56 -- scripts/common.sh@354 -- # echo 1 00:02:34.278 09:25:56 -- scripts/common.sh@364 -- # ver1[v]=1 00:02:34.278 09:25:56 -- scripts/common.sh@365 -- # decimal 2 00:02:34.278 09:25:56 -- scripts/common.sh@352 -- # local d=2 00:02:34.278 09:25:56 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:02:34.278 09:25:56 -- scripts/common.sh@354 -- # echo 2 00:02:34.278 09:25:56 -- scripts/common.sh@365 -- # ver2[v]=2 00:02:34.278 09:25:56 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:02:34.278 09:25:56 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:02:34.278 09:25:56 -- scripts/common.sh@367 -- # return 0 00:02:34.278 09:25:56 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:02:34.278 09:25:56 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:02:34.278 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:34.278 --rc genhtml_branch_coverage=1 00:02:34.278 --rc genhtml_function_coverage=1 00:02:34.278 --rc genhtml_legend=1 00:02:34.278 --rc geninfo_all_blocks=1 00:02:34.278 --rc geninfo_unexecuted_blocks=1 00:02:34.278 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:34.278 ' 00:02:34.278 09:25:56 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:02:34.278 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:34.278 --rc genhtml_branch_coverage=1 00:02:34.278 --rc genhtml_function_coverage=1 00:02:34.278 --rc genhtml_legend=1 00:02:34.278 --rc geninfo_all_blocks=1 00:02:34.278 --rc geninfo_unexecuted_blocks=1 00:02:34.278 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:34.278 ' 00:02:34.278 09:25:56 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:02:34.278 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:34.278 --rc genhtml_branch_coverage=1 00:02:34.278 --rc genhtml_function_coverage=1 00:02:34.278 --rc genhtml_legend=1 00:02:34.278 --rc geninfo_all_blocks=1 00:02:34.278 --rc geninfo_unexecuted_blocks=1 00:02:34.278 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:34.278 ' 00:02:34.278 09:25:56 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:02:34.278 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:34.278 --rc genhtml_branch_coverage=1 00:02:34.278 --rc genhtml_function_coverage=1 00:02:34.278 --rc genhtml_legend=1 00:02:34.278 --rc geninfo_all_blocks=1 00:02:34.278 --rc geninfo_unexecuted_blocks=1 00:02:34.278 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:34.278 ' 00:02:34.278 09:25:56 -- setup/test-setup.sh@10 -- # uname -s 00:02:34.278 09:25:56 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:02:34.278 09:25:56 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:02:34.278 09:25:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:34.278 09:25:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:34.278 09:25:56 -- common/autotest_common.sh@10 -- # set +x 00:02:34.278 ************************************ 00:02:34.278 START TEST acl 00:02:34.278 ************************************ 00:02:34.278 09:25:56 -- 
common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:02:34.278 * Looking for test storage... 00:02:34.278 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:02:34.278 09:25:56 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:02:34.278 09:25:56 -- common/autotest_common.sh@1690 -- # lcov --version 00:02:34.278 09:25:56 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:02:34.278 09:25:56 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:02:34.278 09:25:56 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:02:34.278 09:25:56 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:02:34.278 09:25:56 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:02:34.278 09:25:56 -- scripts/common.sh@335 -- # IFS=.-: 00:02:34.278 09:25:56 -- scripts/common.sh@335 -- # read -ra ver1 00:02:34.278 09:25:56 -- scripts/common.sh@336 -- # IFS=.-: 00:02:34.278 09:25:56 -- scripts/common.sh@336 -- # read -ra ver2 00:02:34.278 09:25:56 -- scripts/common.sh@337 -- # local 'op=<' 00:02:34.278 09:25:56 -- scripts/common.sh@339 -- # ver1_l=2 00:02:34.278 09:25:56 -- scripts/common.sh@340 -- # ver2_l=1 00:02:34.278 09:25:56 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:02:34.278 09:25:56 -- scripts/common.sh@343 -- # case "$op" in 00:02:34.278 09:25:56 -- scripts/common.sh@344 -- # : 1 00:02:34.278 09:25:56 -- scripts/common.sh@363 -- # (( v = 0 )) 00:02:34.278 09:25:56 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:34.278 09:25:56 -- scripts/common.sh@364 -- # decimal 1 00:02:34.279 09:25:56 -- scripts/common.sh@352 -- # local d=1 00:02:34.279 09:25:56 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:02:34.279 09:25:56 -- scripts/common.sh@354 -- # echo 1 00:02:34.279 09:25:56 -- scripts/common.sh@364 -- # ver1[v]=1 00:02:34.279 09:25:56 -- scripts/common.sh@365 -- # decimal 2 00:02:34.279 09:25:56 -- scripts/common.sh@352 -- # local d=2 00:02:34.279 09:25:56 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:02:34.279 09:25:56 -- scripts/common.sh@354 -- # echo 2 00:02:34.279 09:25:56 -- scripts/common.sh@365 -- # ver2[v]=2 00:02:34.279 09:25:56 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:02:34.279 09:25:56 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:02:34.279 09:25:56 -- scripts/common.sh@367 -- # return 0 00:02:34.279 09:25:56 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:02:34.279 09:25:56 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:02:34.279 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:34.279 --rc genhtml_branch_coverage=1 00:02:34.279 --rc genhtml_function_coverage=1 00:02:34.279 --rc genhtml_legend=1 00:02:34.279 --rc geninfo_all_blocks=1 00:02:34.279 --rc geninfo_unexecuted_blocks=1 00:02:34.279 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:34.279 ' 00:02:34.279 09:25:56 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:02:34.279 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:34.279 --rc genhtml_branch_coverage=1 00:02:34.279 --rc genhtml_function_coverage=1 00:02:34.279 --rc genhtml_legend=1 00:02:34.279 --rc geninfo_all_blocks=1 00:02:34.279 --rc geninfo_unexecuted_blocks=1 00:02:34.279 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:34.279 ' 00:02:34.279 09:25:56 -- common/autotest_common.sh@1704 
-- # export 'LCOV=lcov 00:02:34.279 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:34.279 --rc genhtml_branch_coverage=1 00:02:34.279 --rc genhtml_function_coverage=1 00:02:34.279 --rc genhtml_legend=1 00:02:34.279 --rc geninfo_all_blocks=1 00:02:34.279 --rc geninfo_unexecuted_blocks=1 00:02:34.279 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:34.279 ' 00:02:34.279 09:25:56 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:02:34.279 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:34.279 --rc genhtml_branch_coverage=1 00:02:34.279 --rc genhtml_function_coverage=1 00:02:34.279 --rc genhtml_legend=1 00:02:34.279 --rc geninfo_all_blocks=1 00:02:34.279 --rc geninfo_unexecuted_blocks=1 00:02:34.279 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:34.279 ' 00:02:34.279 09:25:56 -- setup/acl.sh@10 -- # get_zoned_devs 00:02:34.279 09:25:56 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:02:34.279 09:25:56 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:02:34.279 09:25:56 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:02:34.279 09:25:56 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:02:34.279 09:25:56 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:02:34.279 09:25:56 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:02:34.279 09:25:56 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:34.279 09:25:56 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:02:34.279 09:25:56 -- setup/acl.sh@12 -- # devs=() 00:02:34.279 09:25:56 -- setup/acl.sh@12 -- # declare -a devs 00:02:34.279 09:25:56 -- setup/acl.sh@13 -- # drivers=() 00:02:34.279 09:25:56 -- setup/acl.sh@13 -- # declare -A drivers 00:02:34.279 09:25:56 -- setup/acl.sh@51 -- # setup reset 00:02:34.279 09:25:56 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:34.279 09:25:56 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:37.569 09:26:00 -- setup/acl.sh@52 -- # collect_setup_devs 00:02:37.569 09:26:00 -- setup/acl.sh@16 -- # local dev driver 00:02:37.569 09:26:00 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:37.569 09:26:00 -- setup/acl.sh@15 -- # setup output status 00:02:37.569 09:26:00 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:37.569 09:26:00 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:02:40.859 Hugepages 00:02:40.859 node hugesize free / total 00:02:40.859 09:26:03 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:02:40.859 09:26:03 -- setup/acl.sh@19 -- # continue 00:02:40.859 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.859 09:26:03 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:02:40.859 09:26:03 -- setup/acl.sh@19 -- # continue 00:02:40.859 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.859 09:26:03 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:02:40.859 09:26:03 -- setup/acl.sh@19 -- # continue 00:02:40.859 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.859 00:02:40.859 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:40.859 09:26:03 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:02:40.859 09:26:03 -- setup/acl.sh@19 -- # continue 00:02:40.859 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.859 
09:26:03 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:02:40.859 09:26:03 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:40.859 09:26:03 -- setup/acl.sh@20 -- # continue 00:02:40.859 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.859 09:26:03 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:02:40.859 09:26:03 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:40.859 09:26:03 -- setup/acl.sh@20 -- # continue 00:02:40.859 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.859 09:26:03 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:02:40.859 09:26:03 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:40.859 09:26:03 -- setup/acl.sh@20 -- # continue 00:02:40.859 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.860 09:26:03 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # continue 00:02:40.860 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.860 09:26:03 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # continue 00:02:40.860 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.860 09:26:03 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # continue 00:02:40.860 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.860 09:26:03 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # continue 00:02:40.860 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.860 09:26:03 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # continue 00:02:40.860 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.860 09:26:03 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # continue 00:02:40.860 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.860 09:26:03 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # continue 00:02:40.860 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.860 09:26:03 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # continue 00:02:40.860 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.860 09:26:03 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # continue 00:02:40.860 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.860 09:26:03 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # continue 00:02:40.860 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 
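The read/continue pattern traced above is acl.sh walking the `setup.sh status` table one row at a time: rows whose BDF column matches *:*:*.* are kept, ioatdma-bound rows are skipped, and anything listed in PCI_BLOCKED is dropped before the nvme controllers land in the devs/drivers arrays. A condensed sketch of that idiom, reconstructed from the trace rather than copied from test/setup/acl.sh:

    devs=()
    declare -A drivers
    while read -r _ dev _ _ _ driver _; do
        [[ $dev == *:*:*.* ]] || continue          # keep only BDF-addressed rows
        [[ $driver == nvme ]] || continue          # ioatdma rows are skipped
        [[ $PCI_BLOCKED == *"$dev"* ]] && continue # honor the block list
        devs+=("$dev")
        drivers["$dev"]=$driver
    done < <(/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status)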
00:02:40.860 09:26:03 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # continue 00:02:40.860 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.860 09:26:03 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # continue 00:02:40.860 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.860 09:26:03 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # continue 00:02:40.860 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.860 09:26:03 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:02:40.860 09:26:03 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:02:40.860 09:26:03 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:02:40.860 09:26:03 -- setup/acl.sh@22 -- # devs+=("$dev") 00:02:40.860 09:26:03 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:02:40.860 09:26:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:40.860 09:26:03 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:02:40.860 09:26:03 -- setup/acl.sh@54 -- # run_test denied denied 00:02:40.860 09:26:03 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:40.860 09:26:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:40.860 09:26:03 -- common/autotest_common.sh@10 -- # set +x 00:02:40.860 ************************************ 00:02:40.860 START TEST denied 00:02:40.860 ************************************ 00:02:40.860 09:26:03 -- common/autotest_common.sh@1114 -- # denied 00:02:40.860 09:26:03 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:02:40.860 09:26:03 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:02:40.860 09:26:03 -- setup/acl.sh@38 -- # setup output config 00:02:40.860 09:26:03 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:40.860 09:26:03 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:02:45.055 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:02:45.055 09:26:07 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:02:45.055 09:26:07 -- setup/acl.sh@28 -- # local dev driver 00:02:45.055 09:26:07 -- setup/acl.sh@30 -- # for dev in "$@" 00:02:45.055 09:26:07 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:02:45.055 09:26:07 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:02:45.055 09:26:07 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:02:45.055 09:26:07 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:02:45.055 09:26:07 -- setup/acl.sh@41 -- # setup reset 00:02:45.055 09:26:07 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:45.055 09:26:07 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:49.249 00:02:49.249 real 0m8.444s 00:02:49.249 user 0m2.711s 00:02:49.249 sys 0m5.118s 00:02:49.249 09:26:11 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:02:49.249 09:26:11 -- common/autotest_common.sh@10 -- # set +x 00:02:49.249 ************************************ 00:02:49.249 END TEST denied 00:02:49.249 ************************************ 00:02:49.249 09:26:11 -- setup/acl.sh@55 -- # run_test allowed allowed 00:02:49.249 09:26:11 -- 
common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:49.249 09:26:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:49.249 09:26:11 -- common/autotest_common.sh@10 -- # set +x 00:02:49.249 ************************************ 00:02:49.249 START TEST allowed 00:02:49.249 ************************************ 00:02:49.249 09:26:12 -- common/autotest_common.sh@1114 -- # allowed 00:02:49.249 09:26:12 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:02:49.249 09:26:12 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:02:49.249 09:26:12 -- setup/acl.sh@45 -- # setup output config 00:02:49.249 09:26:12 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:49.249 09:26:12 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:02:54.527 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:02:54.527 09:26:16 -- setup/acl.sh@47 -- # verify 00:02:54.527 09:26:16 -- setup/acl.sh@28 -- # local dev driver 00:02:54.527 09:26:16 -- setup/acl.sh@48 -- # setup reset 00:02:54.527 09:26:16 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:54.527 09:26:16 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:57.820 00:02:57.820 real 0m8.620s 00:02:57.820 user 0m2.437s 00:02:57.820 sys 0m4.862s 00:02:57.820 09:26:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:02:57.820 09:26:20 -- common/autotest_common.sh@10 -- # set +x 00:02:57.820 ************************************ 00:02:57.820 END TEST allowed 00:02:57.820 ************************************ 00:02:57.820 00:02:57.820 real 0m24.404s 00:02:57.820 user 0m7.773s 00:02:57.820 sys 0m14.949s 00:02:57.820 09:26:20 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:02:57.820 09:26:20 -- common/autotest_common.sh@10 -- # set +x 00:02:57.820 ************************************ 00:02:57.820 END TEST acl 00:02:57.820 ************************************ 00:02:58.080 09:26:20 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:02:58.080 09:26:20 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:58.080 09:26:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:58.080 09:26:20 -- common/autotest_common.sh@10 -- # set +x 00:02:58.080 ************************************ 00:02:58.080 START TEST hugepages 00:02:58.080 ************************************ 00:02:58.080 09:26:20 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:02:58.080 * Looking for test storage... 
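The denied/allowed pair exercised here pulls on the same two knobs from opposite directions: PCI_BLOCKED must make setup.sh refuse a controller, while PCI_ALLOWED must let it through and rebind it to vfio-pci. A sketch of the two assertions, assembled from the grep patterns visible in the trace (the real tests drive this through the setup/common.sh wrapper rather than invoking the script directly):

    setup=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
    bdf=0000:d8:00.0

    # denied: a blocked controller must be skipped during config
    PCI_BLOCKED=" $bdf" "$setup" config | grep "Skipping denied controller at $bdf"

    # allowed: an allow-listed controller must be rebound (nvme -> vfio-pci)
    PCI_ALLOWED=$bdf "$setup" config | grep -E "$bdf .*: nvme -> .*"

    "$setup" reset    # hand the device back to the kernel nvme driver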
00:02:58.080 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:02:58.080 09:26:20 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:02:58.080 09:26:20 -- common/autotest_common.sh@1690 -- # lcov --version 00:02:58.080 09:26:20 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:02:58.080 09:26:20 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:02:58.080 09:26:20 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:02:58.080 09:26:20 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:02:58.080 09:26:20 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:02:58.080 09:26:20 -- scripts/common.sh@335 -- # IFS=.-: 00:02:58.080 09:26:20 -- scripts/common.sh@335 -- # read -ra ver1 00:02:58.080 09:26:20 -- scripts/common.sh@336 -- # IFS=.-: 00:02:58.080 09:26:20 -- scripts/common.sh@336 -- # read -ra ver2 00:02:58.080 09:26:20 -- scripts/common.sh@337 -- # local 'op=<' 00:02:58.080 09:26:20 -- scripts/common.sh@339 -- # ver1_l=2 00:02:58.080 09:26:20 -- scripts/common.sh@340 -- # ver2_l=1 00:02:58.080 09:26:20 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:02:58.080 09:26:20 -- scripts/common.sh@343 -- # case "$op" in 00:02:58.080 09:26:20 -- scripts/common.sh@344 -- # : 1 00:02:58.080 09:26:20 -- scripts/common.sh@363 -- # (( v = 0 )) 00:02:58.080 09:26:20 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:58.080 09:26:20 -- scripts/common.sh@364 -- # decimal 1 00:02:58.080 09:26:20 -- scripts/common.sh@352 -- # local d=1 00:02:58.080 09:26:20 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:02:58.080 09:26:20 -- scripts/common.sh@354 -- # echo 1 00:02:58.080 09:26:20 -- scripts/common.sh@364 -- # ver1[v]=1 00:02:58.080 09:26:20 -- scripts/common.sh@365 -- # decimal 2 00:02:58.080 09:26:20 -- scripts/common.sh@352 -- # local d=2 00:02:58.080 09:26:20 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:02:58.080 09:26:20 -- scripts/common.sh@354 -- # echo 2 00:02:58.080 09:26:20 -- scripts/common.sh@365 -- # ver2[v]=2 00:02:58.080 09:26:20 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:02:58.080 09:26:20 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:02:58.080 09:26:20 -- scripts/common.sh@367 -- # return 0 00:02:58.080 09:26:20 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:02:58.080 09:26:20 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:02:58.080 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:58.080 --rc genhtml_branch_coverage=1 00:02:58.080 --rc genhtml_function_coverage=1 00:02:58.080 --rc genhtml_legend=1 00:02:58.080 --rc geninfo_all_blocks=1 00:02:58.080 --rc geninfo_unexecuted_blocks=1 00:02:58.080 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:58.080 ' 00:02:58.080 09:26:20 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:02:58.080 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:58.080 --rc genhtml_branch_coverage=1 00:02:58.080 --rc genhtml_function_coverage=1 00:02:58.080 --rc genhtml_legend=1 00:02:58.080 --rc geninfo_all_blocks=1 00:02:58.080 --rc geninfo_unexecuted_blocks=1 00:02:58.080 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:58.080 ' 00:02:58.080 09:26:20 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:02:58.080 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:58.080 --rc genhtml_branch_coverage=1 
00:02:58.080 --rc genhtml_function_coverage=1 00:02:58.080 --rc genhtml_legend=1 00:02:58.080 --rc geninfo_all_blocks=1 00:02:58.080 --rc geninfo_unexecuted_blocks=1 00:02:58.080 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:58.080 ' 00:02:58.080 09:26:20 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:02:58.080 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:58.080 --rc genhtml_branch_coverage=1 00:02:58.080 --rc genhtml_function_coverage=1 00:02:58.080 --rc genhtml_legend=1 00:02:58.080 --rc geninfo_all_blocks=1 00:02:58.080 --rc geninfo_unexecuted_blocks=1 00:02:58.080 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:58.080 ' 00:02:58.080 09:26:20 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:02:58.080 09:26:20 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:02:58.080 09:26:20 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:02:58.080 09:26:20 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:02:58.081 09:26:20 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:02:58.081 09:26:20 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:02:58.081 09:26:20 -- setup/common.sh@17 -- # local get=Hugepagesize 00:02:58.081 09:26:20 -- setup/common.sh@18 -- # local node= 00:02:58.081 09:26:20 -- setup/common.sh@19 -- # local var val 00:02:58.081 09:26:20 -- setup/common.sh@20 -- # local mem_f mem 00:02:58.081 09:26:20 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:02:58.081 09:26:20 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:02:58.081 09:26:20 -- setup/common.sh@25 -- # [[ -n '' ]] 00:02:58.081 09:26:20 -- setup/common.sh@28 -- # mapfile -t mem 00:02:58.081 09:26:20 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 41398008 kB' 'MemAvailable: 43032524 kB' 'Buffers: 6816 kB' 'Cached: 9298752 kB' 'SwapCached: 180 kB' 'Active: 6727380 kB' 'Inactive: 3166168 kB' 'Active(anon): 5819900 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591268 kB' 'Mapped: 169200 kB' 'Shmem: 7556864 kB' 'KReclaimable: 582612 kB' 'Slab: 1586904 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1004292 kB' 'KernelStack: 22016 kB' 'PageTables: 9080 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433348 kB' 'Committed_AS: 10078276 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 217924 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- 
setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- 
setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 
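Each IFS/read/continue triplet in this stretch is one iteration of get_meminfo scanning /proc/meminfo for a single key. Stripped of the xtrace noise, the idiom looks roughly like this (a simplified reconstruction of the setup/common.sh helper, not its verbatim source):

    shopt -s extglob   # needed for the +([0-9]) pattern below

    get_meminfo() {
        local get=$1 node=${2:-} line var val _
        local mem_f=/proc/meminfo
        # per-node lookups read that node's own meminfo instead
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local -a mem
        mapfile -t mem < "$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")   # drop the "Node N " prefixes
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }

    # e.g. get_meminfo Hugepagesize -> 2048 on this host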
00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.081 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.081 09:26:20 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e 
]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@32 -- # continue 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # IFS=': ' 00:02:58.082 09:26:20 -- setup/common.sh@31 -- # read -r var val _ 00:02:58.082 
09:26:20 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:02:58.082 09:26:20 -- setup/common.sh@33 -- # echo 2048 00:02:58.082 09:26:20 -- setup/common.sh@33 -- # return 0 00:02:58.082 09:26:20 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:02:58.342 09:26:20 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:02:58.342 09:26:20 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:02:58.342 09:26:20 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:02:58.342 09:26:20 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:02:58.342 09:26:20 -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:02:58.342 09:26:20 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:02:58.342 09:26:20 -- setup/hugepages.sh@207 -- # get_nodes 00:02:58.342 09:26:20 -- setup/hugepages.sh@27 -- # local node 00:02:58.342 09:26:20 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:02:58.342 09:26:20 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:02:58.342 09:26:20 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:02:58.342 09:26:20 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:02:58.342 09:26:20 -- setup/hugepages.sh@32 -- # no_nodes=2 00:02:58.342 09:26:20 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:02:58.342 09:26:20 -- setup/hugepages.sh@208 -- # clear_hp 00:02:58.342 09:26:20 -- setup/hugepages.sh@37 -- # local node hp 00:02:58.342 09:26:20 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:02:58.342 09:26:20 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:02:58.342 09:26:20 -- setup/hugepages.sh@41 -- # echo 0 00:02:58.342 09:26:20 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:02:58.342 09:26:20 -- setup/hugepages.sh@41 -- # echo 0 00:02:58.342 09:26:20 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:02:58.342 09:26:20 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:02:58.342 09:26:20 -- setup/hugepages.sh@41 -- # echo 0 00:02:58.342 09:26:20 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:02:58.342 09:26:20 -- setup/hugepages.sh@41 -- # echo 0 00:02:58.342 09:26:20 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:02:58.342 09:26:20 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:02:58.342 09:26:20 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:02:58.342 09:26:20 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:02:58.342 09:26:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:02:58.342 09:26:20 -- common/autotest_common.sh@10 -- # set +x 00:02:58.342 ************************************ 00:02:58.342 START TEST default_setup 00:02:58.342 ************************************ 00:02:58.342 09:26:20 -- common/autotest_common.sh@1114 -- # default_setup 00:02:58.342 09:26:20 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:02:58.342 09:26:20 -- setup/hugepages.sh@49 -- # local size=2097152 00:02:58.342 09:26:20 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:02:58.342 09:26:20 -- setup/hugepages.sh@51 -- # shift 00:02:58.342 09:26:20 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:02:58.342 09:26:20 -- setup/hugepages.sh@52 -- # local node_ids 00:02:58.342 09:26:20 -- setup/hugepages.sh@55 -- # (( size >= 
default_hugepages )) 00:02:58.342 09:26:20 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:02:58.342 09:26:20 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:02:58.342 09:26:20 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:02:58.342 09:26:20 -- setup/hugepages.sh@62 -- # local user_nodes 00:02:58.342 09:26:20 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:02:58.342 09:26:20 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:02:58.342 09:26:20 -- setup/hugepages.sh@67 -- # nodes_test=() 00:02:58.342 09:26:20 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:02:58.342 09:26:20 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:02:58.342 09:26:20 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:02:58.342 09:26:20 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:02:58.342 09:26:20 -- setup/hugepages.sh@73 -- # return 0 00:02:58.342 09:26:20 -- setup/hugepages.sh@137 -- # setup output 00:02:58.342 09:26:20 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:58.342 09:26:20 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:01.633 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:01.633 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:01.633 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:01.633 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:01.633 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:01.634 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:01.634 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:01.634 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:01.634 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:01.634 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:01.634 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:01.634 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:01.634 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:01.634 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:01.634 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:01.634 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:03.018 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:03.018 09:26:25 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:03.018 09:26:25 -- setup/hugepages.sh@89 -- # local node 00:03:03.018 09:26:25 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:03.018 09:26:25 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:03.018 09:26:25 -- setup/hugepages.sh@92 -- # local surp 00:03:03.018 09:26:25 -- setup/hugepages.sh@93 -- # local resv 00:03:03.018 09:26:25 -- setup/hugepages.sh@94 -- # local anon 00:03:03.018 09:26:25 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:03.018 09:26:25 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:03.018 09:26:25 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:03.018 09:26:25 -- setup/common.sh@18 -- # local node= 00:03:03.018 09:26:25 -- setup/common.sh@19 -- # local var val 00:03:03.018 09:26:25 -- setup/common.sh@20 -- # local mem_f mem 00:03:03.018 09:26:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:03.018 09:26:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:03.018 09:26:25 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:03.018 09:26:25 -- setup/common.sh@28 -- # mapfile -t mem 00:03:03.018 09:26:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:03.019 09:26:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:03.019 09:26:25 -- setup/common.sh@31 -- # read -r var val _ 
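The sizing just traced (get_test_nr_hugepages 2097152 0) is plain division: the requested 2097152 kB over the 2048 kB default hugepage size yields the 1024 pages pinned to node 0. As a sketch, reusing the get_meminfo helper sketched earlier and the sysfs path the clear_hp loop echoes into; the test itself drives this through the setup wrapper, so treat the direct sysfs write as an illustration:

    size_kb=2097152
    hugepagesize_kb=$(get_meminfo Hugepagesize)    # 2048 kB on this host
    nr_hugepages=$(( size_kb / hugepagesize_kb ))  # -> 1024
    echo "$nr_hugepages" \
        > "/sys/devices/system/node/node0/hugepages/hugepages-${hugepagesize_kb}kB/nr_hugepages"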
00:03:03.019 09:26:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43585400 kB' 'MemAvailable: 45219916 kB' 'Buffers: 6816 kB' 'Cached: 9298884 kB' 'SwapCached: 180 kB' 'Active: 6728400 kB' 'Inactive: 3166168 kB' 'Active(anon): 5820920 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592176 kB' 'Mapped: 169252 kB' 'Shmem: 7556996 kB' 'KReclaimable: 582612 kB' 'Slab: 1585856 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003244 kB' 'KernelStack: 22064 kB' 'PageTables: 8956 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10078692 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218020 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:03.019 09:26:25 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:03.019 09:26:25 -- setup/common.sh@32 -- # continue
00:03:03.019 09:26:25 -- setup/common.sh@31 -- # IFS=': '
00:03:03.019 09:26:25 -- setup/common.sh@31 -- # read -r var val _
[... compare/continue trace elided: the same @32 test / @32 continue / @31 IFS / @31 read cycle repeats for every remaining /proc/meminfo key until AnonHugePages matches ...]
00:03:03.020 09:26:25 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:03.020 09:26:25 -- setup/common.sh@33 -- # echo 0
00:03:03.020 09:26:25 -- setup/common.sh@33 -- # return 0
00:03:03.020 09:26:25 -- setup/hugepages.sh@97 -- # anon=0
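The records above are bash xtrace output from the harness's get_meminfo helper in setup/common.sh: it prints a /proc/meminfo snapshot, then scans it key by key for the requested field (here AnonHugePages). The backslash-riddled operands such as \A\n\o\n\H\u\g\e\P\a\g\e\s are not corruption: when the right-hand side of == inside [[ ]] comes from a quoted expansion, xtrace escapes every character to show the match is literal rather than a glob. A minimal reproduction, assuming the harness compares with a quoted operand (variable names here are illustrative):

    set -x
    get=AnonHugePages var=MemTotal
    # A quoted right-hand side makes [[ ]] do a literal string match,
    # and xtrace renders that pattern character-escaped:
    [[ $var == "$get" ]] && echo match
    # trace shown: + [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]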
00:03:03.020 09:26:25 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:03.020 09:26:25 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:03.020 09:26:25 -- setup/common.sh@18 -- # local node=
00:03:03.020 09:26:25 -- setup/common.sh@19 -- # local var val
00:03:03.020 09:26:25 -- setup/common.sh@20 -- # local mem_f mem
00:03:03.020 09:26:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:03.020 09:26:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:03.020 09:26:25 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:03.020 09:26:25 -- setup/common.sh@28 -- # mapfile -t mem
00:03:03.020 09:26:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:03.020 09:26:25 -- setup/common.sh@31 -- # IFS=': '
00:03:03.020 09:26:25 -- setup/common.sh@31 -- # read -r var val _
00:03:03.020 09:26:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43587996 kB' 'MemAvailable: 45222512 kB' 'Buffers: 6816 kB' 'Cached: 9298888 kB' 'SwapCached: 180 kB' 'Active: 6728840 kB' 'Inactive: 3166168 kB' 'Active(anon): 5821360 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592616 kB' 'Mapped: 169296 kB' 'Shmem: 7557000 kB' 'KReclaimable: 582612 kB' 'Slab: 1585948 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003336 kB' 'KernelStack: 22064 kB' 'PageTables: 8820 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10080216 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[... compare/continue trace elided: the same cycle repeats for each key until HugePages_Surp matches ...]
00:03:03.021 09:26:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:03.021 09:26:25 -- setup/common.sh@33 -- # echo 0
00:03:03.021 09:26:25 -- setup/common.sh@33 -- # return 0
00:03:03.021 09:26:25 -- setup/hugepages.sh@99 -- # surp=0
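Both passes above walk the same snapshot and differ only in the key they stop at (AnonHugePages, then HugePages_Surp). A condensed sketch of that lookup, reconstructed from the traced commands; the function name and error handling are illustrative, not a verbatim copy of the harness's common.sh:

    #!/usr/bin/env bash
    shopt -s extglob

    # Sketch of the traced lookup: load a meminfo file, strip the "Node <n> "
    # prefix that per-node files carry, then scan "key: value" pairs.
    get_meminfo_sketch() {
        local get=$1 node=${2-}
        local mem_f=/proc/meminfo
        local -a mem
        local line var val _

        # Mirrors common.sh@23-24 in the trace: prefer the per-node file
        # when a node was requested and its meminfo exists.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi

        mapfile -t mem <"$mem_f"
        mem=("${mem[@]#Node +([0-9]) }")

        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<<"$line"
            [[ $var == "$get" ]] || continue
            echo "${val:-0}"
            return 0
        done
        return 1
    }

    # Usage: get_meminfo_sketch HugePages_Surp      # system-wide (0 in this run)
    #        get_meminfo_sketch HugePages_Surp 0    # NUMA node 0 only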
00:03:03.021 09:26:25 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:03.021 09:26:25 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:03.021 09:26:25 -- setup/common.sh@18 -- # local node=
00:03:03.021 09:26:25 -- setup/common.sh@19 -- # local var val
00:03:03.021 09:26:25 -- setup/common.sh@20 -- # local mem_f mem
00:03:03.021 09:26:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:03.021 09:26:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:03.021 09:26:25 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:03.021 09:26:25 -- setup/common.sh@28 -- # mapfile -t mem
00:03:03.021 09:26:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:03.021 09:26:25 -- setup/common.sh@31 -- # IFS=': '
00:03:03.021 09:26:25 -- setup/common.sh@31 -- # read -r var val _
00:03:03.021 09:26:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43594756 kB' 'MemAvailable: 45229272 kB' 'Buffers: 6816 kB' 'Cached: 9298900 kB' 'SwapCached: 180 kB' 'Active: 6729724 kB' 'Inactive: 3166168 kB' 'Active(anon): 5822244 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593580 kB' 'Mapped: 169312 kB' 'Shmem: 7557012 kB' 'KReclaimable: 582612 kB' 'Slab: 1585900 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003288 kB' 'KernelStack: 22336 kB' 'PageTables: 9696 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10080228 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218196 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[... compare/continue trace elided: the same cycle repeats for each key until HugePages_Rsvd matches ...]
00:03:03.023 09:26:25 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:03.023 09:26:25 -- setup/common.sh@33 -- # echo 0
00:03:03.023 09:26:25 -- setup/common.sh@33 -- # return 0
00:03:03.023 09:26:25 -- setup/hugepages.sh@100 -- # resv=0
00:03:03.023 09:26:25 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
nr_hugepages=1024
00:03:03.023 09:26:25 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:03:03.023 09:26:25 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:03:03.023 09:26:25 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:03:03.023 09:26:25 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:03.023 09:26:25 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:03.023 09:26:25 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:03.023 09:26:25 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:03.023 09:26:25 -- setup/common.sh@18 -- # local node=
00:03:03.023 09:26:25 -- setup/common.sh@19 -- # local var val
00:03:03.023 09:26:25 -- setup/common.sh@20 -- # local mem_f mem
00:03:03.023 09:26:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:03.023 09:26:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:03.023 09:26:25 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:03.023 09:26:25 -- setup/common.sh@28 -- # mapfile -t mem
00:03:03.023 09:26:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:03.023 09:26:25 -- setup/common.sh@31 -- # IFS=': '
00:03:03.023 09:26:25 -- setup/common.sh@31 -- # read -r var val _
00:03:03.023 09:26:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43593936 kB' 'MemAvailable: 45228452 kB' 'Buffers: 6816 kB' 'Cached: 9298900 kB' 'SwapCached: 180 kB' 'Active: 6729660 kB' 'Inactive: 3166168 kB' 'Active(anon): 5822180 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593444 kB' 'Mapped: 169296 kB' 'Shmem: 7557012 kB' 'KReclaimable: 582612 kB' 'Slab: 1585772 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003160 kB' 'KernelStack: 22272 kB' 'PageTables: 9708 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10080244 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218228 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
[... compare/continue trace elided: the same cycle repeats for each key until HugePages_Total matches ...]
00:03:03.024 09:26:25 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:03.024 09:26:25 -- setup/common.sh@33 -- # echo 1024
00:03:03.024 09:26:25 -- setup/common.sh@33 -- # return 0
00:03:03.024 09:26:25 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:03.024 09:26:25 -- setup/hugepages.sh@112 -- # get_nodes
00:03:03.024 09:26:25 -- setup/hugepages.sh@27 -- # local node
00:03:03.024 09:26:25 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:03.024 09:26:25 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:03.024 09:26:25 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:03.024 09:26:25 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:03.024 09:26:25 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:03.024 09:26:25 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:03.024 09:26:25 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:03.024 09:26:25 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:03.024 09:26:25 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:03.024 09:26:25 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:03.025 09:26:25 -- setup/common.sh@18 -- # local node=0
00:03:03.025 09:26:25 -- setup/common.sh@19 -- # local var val
00:03:03.025 09:26:25 -- setup/common.sh@20 -- # local mem_f mem
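After the three system-wide queries, hugepages.sh checks that the pool adds up (@107-@110) and then repeats the HugePages_Surp lookup once per NUMA node (@112-@117), which is where the node0 trace below comes from. A sketch of that control flow, reusing get_meminfo_sketch from the note above; the nodes_sys/nodes_test bookkeeping is simplified here, not the harness's exact arrays:

    shopt -s extglob
    nr_hugepages=1024

    surp=$(get_meminfo_sketch HugePages_Surp)    # 0 in this run
    resv=$(get_meminfo_sketch HugePages_Rsvd)    # 0 in this run
    total=$(get_meminfo_sketch HugePages_Total)  # 1024 in this run

    # The pool is consistent only if every configured page is accounted for.
    (( total == nr_hugepages + surp + resv )) || echo 'hugepage accounting mismatch' >&2

    # Per-node fan-out: one surplus query per /sys node directory.
    for node in /sys/devices/system/node/node+([0-9]); do
        id=${node##*node}
        echo "node$id HugePages_Surp: $(get_meminfo_sketch HugePages_Surp "$id")"
    done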
00:03:03.024 09:26:25 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:03.025 09:26:25 -- setup/common.sh@18 -- # local node=0
00:03:03.025 09:26:25 -- setup/common.sh@19 -- # local var val
00:03:03.025 09:26:25 -- setup/common.sh@20 -- # local mem_f mem
00:03:03.025 09:26:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:03.025 09:26:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:03.025 09:26:25 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:03.025 09:26:25 -- setup/common.sh@28 -- # mapfile -t mem
00:03:03.025 09:26:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:03.025 09:26:25 -- setup/common.sh@31 -- # IFS=': '
00:03:03.025 09:26:25 -- setup/common.sh@31 -- # read -r var val _
00:03:03.025 09:26:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23693164 kB' 'MemUsed: 8941272 kB' 'SwapCached: 80 kB' 'Active: 3965528 kB' 'Inactive: 535356 kB' 'Active(anon): 3187964 kB' 'Inactive(anon): 152 kB' 'Active(file): 777564 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4220776 kB' 'Mapped: 104968 kB' 'AnonPages: 283284 kB' 'Shmem: 2907928 kB' 'KernelStack: 10328 kB' 'PageTables: 5792 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 393860 kB' 'Slab: 876464 kB' 'SReclaimable: 393860 kB' 'SUnreclaim: 482604 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:03.025 09:26:25 -- setup/common.sh@32 -- # [node0 keys MemTotal through HugePages_Free each fail the HugePages_Surp match and hit continue]
00:03:03.026 09:26:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:03.026 09:26:25 -- setup/common.sh@33 -- # echo 0
00:03:03.026 09:26:25 -- setup/common.sh@33 -- # return 0
00:03:03.026 09:26:25 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:03.026 09:26:25 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:03.026 09:26:25 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:03.026 09:26:25 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:03.026 09:26:25 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:03.026 node0=1024 expecting 1024
00:03:03.026 09:26:25 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:03.026 
00:03:03.026 real	0m4.882s
00:03:03.026 user	0m1.171s
00:03:03.026 sys	0m2.266s
00:03:03.026 09:26:25 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:03.026 09:26:25 -- common/autotest_common.sh@10 -- # set +x
00:03:03.026 ************************************
00:03:03.026 END TEST default_setup
00:03:03.026 ************************************
00:03:03.285 09:26:25 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:03:03.285 09:26:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:03.285 09:26:25 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:03.285 09:26:25 -- common/autotest_common.sh@10 -- # set +x
00:03:03.286 ************************************
00:03:03.286 START TEST per_node_1G_alloc
00:03:03.286 ************************************
00:03:03.286 09:26:25 -- common/autotest_common.sh@1114 -- # per_node_1G_alloc
00:03:03.286 09:26:25 -- setup/hugepages.sh@143 -- # local IFS=,
00:03:03.286 09:26:25 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:03:03.286 09:26:25 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:03.286 09:26:25 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:03:03.286 09:26:25 -- setup/hugepages.sh@51 -- # shift
00:03:03.286 09:26:25 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:03:03.286 09:26:25 -- setup/hugepages.sh@52 -- # local node_ids
00:03:03.286 09:26:25 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:03.286 09:26:25 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:03.286 09:26:25 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
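The nr_hugepages=512 above is straight arithmetic: per_node_1G_alloc requests size=1048576 kB (1 GiB) for each of nodes 0 and 1, and the default hugepage size on this host is 2048 kB. A one-line sketch of the derivation (variable names are illustrative, not the SPDK source):

    size_kb=1048576     # 1 GiB requested per node, the first argument traced above
    hugepage_kb=2048    # Hugepagesize as reported in /proc/meminfo
    echo $(( size_kb / hugepage_kb ))   # -> 512 pages per node; two nodes -> 1024 total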
00:03:03.286 09:26:25 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:03:03.286 09:26:25 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:03.286 09:26:25 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:03.286 09:26:25 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:03.286 09:26:25 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:03.286 09:26:25 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:03.286 09:26:25 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:03:03.286 09:26:25 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:03.286 09:26:25 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:03.286 09:26:25 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:03.286 09:26:25 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:03.286 09:26:25 -- setup/hugepages.sh@73 -- # return 0
00:03:03.286 09:26:25 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:03:03.286 09:26:25 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:03:03.286 09:26:25 -- setup/hugepages.sh@146 -- # setup output
00:03:03.286 09:26:25 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:03.286 09:26:25 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:05.820 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:05.820 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:05.820 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:05.820 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:05.820 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:05.820 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:05.820 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:05.820 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:05.820 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:05.820 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:05.820 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:05.820 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:05.820 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:05.820 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:05.820 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:05.820 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:05.820 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:06.083 09:26:28 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:03:06.084 09:26:28 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
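Before verify_nr_hugepages runs its checks, scripts/setup.sh (invoked above with NRHUGE=512 HUGENODE=0,1) has reserved the pages on each node. A hedged sketch of the per-node reservation mechanism via the standard kernel sysfs knob -- the general idea, not scripts/setup.sh itself:

    NRHUGE=512
    for node in 0 1; do            # the HUGENODE=0,1 list
        # Write the per-node pool size; the 2048kB directory is the
        # default hugepage size seen in this trace.
        echo "$NRHUGE" | sudo tee \
            "/sys/devices/system/node/node$node/hugepages/hugepages-2048kB/nr_hugepages"
    done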
00:03:06.084 09:26:28 -- setup/hugepages.sh@89 -- # local node
00:03:06.084 09:26:28 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:06.084 09:26:28 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:06.084 09:26:28 -- setup/hugepages.sh@92 -- # local surp
00:03:06.084 09:26:28 -- setup/hugepages.sh@93 -- # local resv
00:03:06.084 09:26:28 -- setup/hugepages.sh@94 -- # local anon
00:03:06.084 09:26:28 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:06.084 09:26:28 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:06.084 09:26:28 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:06.084 09:26:28 -- setup/common.sh@18 -- # local node=
00:03:06.084 09:26:28 -- setup/common.sh@19 -- # local var val
00:03:06.084 09:26:28 -- setup/common.sh@20 -- # local mem_f mem
00:03:06.084 09:26:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:06.084 09:26:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:06.084 09:26:28 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:06.084 09:26:28 -- setup/common.sh@28 -- # mapfile -t mem
00:03:06.084 09:26:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:06.084 09:26:28 -- setup/common.sh@31 -- # IFS=': '
00:03:06.084 09:26:28 -- setup/common.sh@31 -- # read -r var val _
00:03:06.084 09:26:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43619876 kB' 'MemAvailable: 45254392 kB' 'Buffers: 6816 kB' 'Cached: 9299000 kB' 'SwapCached: 180 kB' 'Active: 6729976 kB' 'Inactive: 3166168 kB' 'Active(anon): 5822496 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592912 kB' 'Mapped: 169344 kB' 'Shmem: 7557112 kB' 'KReclaimable: 582612 kB' 'Slab: 1586172 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003560 kB' 'KernelStack: 22032 kB' 'PageTables: 8872 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10080728 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218180 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:06.084 09:26:28 -- setup/common.sh@32 -- # [keys MemTotal through HardwareCorrupted each fail the AnonHugePages match and hit continue]
00:03:06.085 09:26:28 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:06.085 09:26:28 -- setup/common.sh@33 -- # echo 0
00:03:06.085 09:26:28 -- setup/common.sh@33 -- # return 0
00:03:06.085 09:26:28 -- setup/hugepages.sh@97 -- # anon=0
00:03:06.085 09:26:28 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:06.085 09:26:28 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:06.085 09:26:28 -- setup/common.sh@18 -- # local node=
00:03:06.085 09:26:28 -- setup/common.sh@19 -- # local var val
00:03:06.085 09:26:28 -- setup/common.sh@20 -- # local mem_f mem
00:03:06.085 09:26:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:06.085 09:26:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:06.085 09:26:28 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:06.085 09:26:28 -- setup/common.sh@28 -- # mapfile -t mem
00:03:06.085 09:26:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:06.085 09:26:28 -- setup/common.sh@31 -- # IFS=': '
00:03:06.085 09:26:28 -- setup/common.sh@31 -- # read -r var val _
00:03:06.085 09:26:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43618540 kB' 'MemAvailable: 45253056 kB' 'Buffers: 6816 kB' 'Cached: 9299004 kB' 'SwapCached: 180 kB' 'Active: 6729884 kB' 'Inactive: 3166168 kB' 'Active(anon): 5822404 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593500 kB' 'Mapped: 169300 kB' 'Shmem: 7557116 kB' 'KReclaimable: 582612 kB' 'Slab: 1586304 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003692 kB' 'KernelStack: 22224 kB' 'PageTables: 9372 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10081108 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218212 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:06.085 09:26:28 -- setup/common.sh@32 -- # [keys MemTotal through HugePages_Rsvd each fail the HugePages_Surp match and hit continue]
00:03:06.086 09:26:28 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:06.086 09:26:28 -- setup/common.sh@33 -- # echo 0
00:03:06.086 09:26:28 -- setup/common.sh@33 -- # return 0
00:03:06.086 09:26:28 -- setup/hugepages.sh@99 -- # surp=0
00:03:06.086 09:26:28 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:06.086 09:26:28 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:06.086 09:26:28 -- setup/common.sh@18 -- # local node=
00:03:06.086 09:26:28 -- setup/common.sh@19 -- # local var val
00:03:06.086 09:26:28 -- setup/common.sh@20 -- # local mem_f mem
00:03:06.086 09:26:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:06.086 09:26:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:06.086 09:26:28 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:06.086 09:26:28 -- setup/common.sh@28 -- # mapfile -t mem
00:03:06.086 09:26:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:06.086 09:26:28 -- setup/common.sh@31 -- # IFS=': '
00:03:06.086 09:26:28 -- setup/common.sh@31 -- # read -r var val _
00:03:06.087 09:26:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43618020 kB' 'MemAvailable: 45252536 kB' 'Buffers: 6816 kB' 'Cached: 9299004 kB' 'SwapCached: 180 kB' 'Active: 6729584 kB' 'Inactive: 3166168 kB' 'Active(anon): 5822104 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593152 kB' 'Mapped: 169300 kB' 'Shmem: 7557116 kB' 'KReclaimable: 582612 kB' 'Slab: 1586332 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003720 kB' 'KernelStack: 22176 kB' 'PageTables: 9068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10081124 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218196 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
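Each of the three counters verify_nr_hugepages collects (anon and surp above, plus the resv scan finishing below) costs a full rescan of the meminfo file under xtrace. Outside the harness the same values can be pulled in a single pass; a small sketch:

    # Print the three hugepage accounting counters in one scan of /proc/meminfo.
    awk '/^(AnonHugePages|HugePages_Surp|HugePages_Rsvd):/ { print $1, $2 }' /proc/meminfo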
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:06.087 09:26:28 -- setup/common.sh@32 -- # continue 00:03:06.087 
09:26:28 -- setup/common.sh@31 -- # IFS=': '
00:03:06.087 09:26:28 -- setup/common.sh@31 -- # read -r var val _
00:03:06.087 09:26:28 -- setup/common.sh@32 -- # [xtrace condensed: each remaining meminfo key, SUnreclaim through HugePages_Free, compared against HugePages_Rsvd and skipped with continue]
00:03:06.088 09:26:28 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:06.088 09:26:28 -- setup/common.sh@33 -- # echo 0
00:03:06.088 09:26:28 -- setup/common.sh@33 -- # return 0
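The span just condensed is the tail of setup/common.sh's get_meminfo loop: it streams a meminfo-style file through read with IFS=': ', hits continue for every key that is not the requested one, and echoes the value on the first match. A minimal stand-alone sketch of that idiom (meminfo_value is a hypothetical name, not the SPDK helper; assumes bash and a kernel exposing /proc/meminfo):

# meminfo_value KEY [FILE] -- print the value column for KEY, dropping the
# trailing "kB" unit; the same key-compare-or-continue loop traced above.
meminfo_value() {
    local get=$1 file=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # the repeated "continue" lines in the xtrace
        echo "$val"                        # the unit ("kB") lands in $_ and is discarded
        return 0
    done < "$file"
    return 1                               # key not present
}
# meminfo_value HugePages_Rsvd  -> "0" on this box, matching the "echo 0" above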
00:03:06.088 09:26:28 -- setup/hugepages.sh@100 -- # resv=0
00:03:06.088 09:26:28 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
nr_hugepages=1024
00:03:06.088 09:26:28 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
resv_hugepages=0
00:03:06.088 09:26:28 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
surplus_hugepages=0
00:03:06.088 09:26:28 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
anon_hugepages=0
00:03:06.088 09:26:28 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:06.088 09:26:28 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:06.088 09:26:28 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:06.088 09:26:28 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:06.088 09:26:28 -- setup/common.sh@18 -- # local node=
00:03:06.088 09:26:28 -- setup/common.sh@19 -- # local var val
00:03:06.088 09:26:28 -- setup/common.sh@20 -- # local mem_f mem
00:03:06.088 09:26:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:06.088 09:26:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:06.088 09:26:28 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:06.088 09:26:28 -- setup/common.sh@28 -- # mapfile -t mem
00:03:06.088 09:26:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:06.088 09:26:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43617088 kB' 'MemAvailable: 45251604 kB' 'Buffers: 6816 kB' 'Cached: 9299028 kB' 'SwapCached: 180 kB' 'Active: 6729680 kB' 'Inactive: 3166168 kB' 'Active(anon): 5822200 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593224 kB' 'Mapped: 169300 kB' 'Shmem: 7557140 kB' 'KReclaimable: 582612 kB' 'Slab: 1586332 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003720 kB' 'KernelStack: 22080 kB' 'PageTables: 9196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10081136 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218228 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:06.088 09:26:28 -- setup/common.sh@31 -- # IFS=': '
00:03:06.088 09:26:28 -- setup/common.sh@31 -- # read -r var val _
00:03:06.088 09:26:28 -- setup/common.sh@32 -- # [xtrace condensed: keys MemTotal through Unaccepted compared against HugePages_Total and skipped with continue]
00:03:06.089 09:26:28 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:06.089 09:26:28 -- setup/common.sh@33 -- # echo 1024
00:03:06.089 09:26:28 -- setup/common.sh@33 -- # return 0
00:03:06.089 09:26:28 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:06.089 09:26:28 -- setup/hugepages.sh@112 -- # get_nodes
00:03:06.089 09:26:28 -- setup/hugepages.sh@27 -- # local node
00:03:06.089 09:26:28 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:06.089 09:26:28 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:06.089 09:26:28 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:06.089 09:26:28 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:06.090 09:26:28 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:06.090 09:26:28 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:06.090 09:26:28 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:06.090 09:26:28 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:06.090 09:26:28 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:06.090 09:26:28 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:06.090 09:26:28 -- setup/common.sh@18 -- # local node=0
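get_nodes, traced just above, discovers the NUMA topology by globbing sysfs with the extglob pattern node+([0-9]) and keying an array by the numeric suffix; no_nodes=2 on this machine. A sketch of the same enumeration (the per-node nr_hugepages path is my assumption about where the 512 values in nodes_sys come from):

# Enumerate NUMA nodes the way the get_nodes trace does; requires extglob.
shopt -s extglob nullglob
nodes_sys=()
for node in /sys/devices/system/node/node+([0-9]); do
    # ${node##*node} strips everything through the last "node", leaving the index
    nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
done
echo "no_nodes=${#nodes_sys[@]}"   # prints no_nodes=2 on this box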
00:03:06.090 09:26:28 -- setup/common.sh@19 -- # local var val
00:03:06.090 09:26:28 -- setup/common.sh@20 -- # local mem_f mem
00:03:06.090 09:26:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:06.090 09:26:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:06.090 09:26:28 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:06.090 09:26:28 -- setup/common.sh@28 -- # mapfile -t mem
00:03:06.090 09:26:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:06.090 09:26:28 -- setup/common.sh@31 -- # IFS=': '
00:03:06.090 09:26:28 -- setup/common.sh@31 -- # read -r var val _
00:03:06.090 09:26:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 24741952 kB' 'MemUsed: 7892484 kB' 'SwapCached: 80 kB' 'Active: 3965320 kB' 'Inactive: 535356 kB' 'Active(anon): 3187756 kB' 'Inactive(anon): 152 kB' 'Active(file): 777564 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4220780 kB' 'Mapped: 104972 kB' 'AnonPages: 283068 kB' 'Shmem: 2907932 kB' 'KernelStack: 10136 kB' 'PageTables: 4944 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 393860 kB' 'Slab: 876972 kB' 'SReclaimable: 393860 kB' 'SUnreclaim: 483112 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:06.090 09:26:28 -- setup/common.sh@32 -- # [xtrace condensed: node0 keys MemTotal through HugePages_Free compared against HugePages_Surp and skipped with continue]
00:03:06.091 09:26:28 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:06.091 09:26:28 -- setup/common.sh@33 -- # echo 0
00:03:06.091 09:26:28 -- setup/common.sh@33 -- # return 0
00:03:06.091 09:26:28 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:06.091 09:26:28 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:06.091 09:26:28 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:06.091 09:26:28 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:06.091 09:26:28 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:06.091 09:26:28 -- setup/common.sh@18 -- # local node=1
00:03:06.091 09:26:28 -- setup/common.sh@19 -- # local var val
00:03:06.091 09:26:28 -- setup/common.sh@20 -- # local mem_f mem
00:03:06.091 09:26:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:06.091 09:26:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:06.091 09:26:28 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:06.091 09:26:28 -- setup/common.sh@28 -- # mapfile -t mem
00:03:06.091 09:26:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:06.091 09:26:28 -- setup/common.sh@31 -- # IFS=': '
00:03:06.091 09:26:28 -- setup/common.sh@31 -- # read -r var val _
00:03:06.091 09:26:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 18874416 kB' 'MemUsed: 8774944 kB' 'SwapCached: 100 kB' 'Active: 2764308 kB' 'Inactive: 2630812 kB' 'Active(anon): 2634392 kB' 'Inactive(anon): 2324792 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5085272 kB' 'Mapped: 63980 kB' 'AnonPages: 310088 kB' 'Shmem: 4649236 kB' 'KernelStack: 11992 kB' 'PageTables: 3876 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 188752 kB' 'Slab: 709360 kB' 'SReclaimable: 188752 kB' 'SUnreclaim: 520608 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:06.091 09:26:28 -- setup/common.sh@32 -- # [xtrace condensed: node1 keys MemTotal through HugePages_Free compared against HugePages_Surp and skipped with continue]
00:03:06.092 09:26:28 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:06.092 09:26:28 -- setup/common.sh@33 -- # echo 0
00:03:06.092 09:26:28 -- setup/common.sh@33 -- # return 0
00:03:06.092 09:26:28 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
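Each get_meminfo HugePages_Surp N call above reads the per-node file /sys/devices/system/node/nodeN/meminfo, whose lines carry a "Node N " prefix that the mem=("${mem[@]#Node +([0-9]) }") trace strips before the usual key scan. A sketch of that per-node variant, assuming extglob and a NUMA kernel (node_meminfo_value is a hypothetical name):

# node_meminfo_value KEY NODE -- like the flat scan, but on the node's meminfo
# after dropping the "Node N " prefix each line starts with.
shopt -s extglob
node_meminfo_value() {
    local get=$1 node=$2 line var val _
    while read -r line; do
        line=${line#Node +([0-9]) }            # "Node 0 HugePages_Surp: 0" -> "HugePages_Surp: 0"
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < "/sys/devices/system/node/node$node/meminfo"
    return 1
}
# node_meminfo_value HugePages_Surp 0 -> "0", so nodes_test[0] is left at 512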
00:03:06.092 09:26:28 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:06.092 09:26:28 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:06.092 09:26:28 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:06.092 09:26:28 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:03:06.092 09:26:28 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:06.092 09:26:28 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:06.092 09:26:28 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:06.092 09:26:28 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
node1=512 expecting 512
00:03:06.092 09:26:28 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:06.092
00:03:06.092 real	0m3.010s
00:03:06.092 user	0m0.990s
00:03:06.092 sys	0m1.867s
00:03:06.092 09:26:28 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:06.092 09:26:28 -- common/autotest_common.sh@10 -- # set +x
00:03:06.092 ************************************
00:03:06.092 END TEST per_node_1G_alloc
00:03:06.092 ************************************
00:03:06.352 09:26:28 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:03:06.352 09:26:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:06.352 09:26:28 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:06.352 09:26:28 -- common/autotest_common.sh@10 -- # set +x
00:03:06.352 ************************************
00:03:06.352 START TEST even_2G_alloc
00:03:06.352 ************************************
00:03:06.352 09:26:28 -- common/autotest_common.sh@1114 -- # even_2G_alloc
00:03:06.352 09:26:28 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152
00:03:06.352 09:26:28 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:06.352 09:26:28 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:06.352 09:26:28 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:06.352 09:26:28 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:06.352 09:26:28 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:06.352 09:26:28 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:06.352 09:26:28 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:06.352 09:26:28 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:06.352 09:26:28 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:06.352 09:26:28 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:06.352 09:26:28 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:06.352 09:26:28 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:06.352 09:26:28 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:06.352 09:26:28 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:06.352 09:26:28 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:06.352 09:26:28 -- setup/hugepages.sh@83 -- # : 512
00:03:06.352 09:26:28 -- setup/hugepages.sh@84 -- # : 1
00:03:06.352 09:26:28 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:06.352 09:26:28 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:06.352 09:26:28 -- setup/hugepages.sh@83 -- # : 0
00:03:06.352 09:26:28 -- setup/hugepages.sh@84 -- # : 0
00:03:06.352 09:26:28 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:06.352 09:26:28 -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:03:06.352 09:26:28 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:03:06.352 09:26:28 -- setup/hugepages.sh@153 -- # setup output
00:03:06.352 09:26:28 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:06.352 09:26:28 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:09.649 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:09.649 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:09.649 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:09.649 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:09.649 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:09.649 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:09.649 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:09.649 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:09.649 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:09.649 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:09.649 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:09.649 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:09.649 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:09.649 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:09.649 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:09.649 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:09.649 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
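even_2G_alloc asks setup.sh (via NRHUGE=1024 and HUGE_EVEN_ALLOC=yes) for 1024 pages spread evenly, and the get_test_nr_hugepages_per_node trace above fills nodes_test from the highest index down, 512 per node. A sketch of that arithmetic under the same two-node assumption:

# Split nr_hugepages evenly over no_nodes, filling the expectation array from
# no_nodes-1 down to 0 as the "nodes_test[_no_nodes - 1]=512" trace shows.
nr_hugepages=1024
no_nodes=2
per_node=$(( nr_hugepages / no_nodes ))       # 512, the "expecting 512" value
nodes_test=()
for (( node = no_nodes - 1; node >= 0; node-- )); do
    nodes_test[node]=$per_node
done
echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"   # node0=512 node1=512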
00:03:09.649 09:26:32 -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:03:09.649 09:26:32 -- setup/hugepages.sh@89 -- # local node
00:03:09.649 09:26:32 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:09.649 09:26:32 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:09.649 09:26:32 -- setup/hugepages.sh@92 -- # local surp
00:03:09.649 09:26:32 -- setup/hugepages.sh@93 -- # local resv
00:03:09.649 09:26:32 -- setup/hugepages.sh@94 -- # local anon
00:03:09.649 09:26:32 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:09.649 09:26:32 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:09.649 09:26:32 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:09.649 09:26:32 -- setup/common.sh@18 -- # local node=
00:03:09.649 09:26:32 -- setup/common.sh@19 -- # local var val
00:03:09.649 09:26:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:09.649 09:26:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:09.649 09:26:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:09.649 09:26:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:09.649 09:26:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:09.649 09:26:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:09.649 09:26:32 -- setup/common.sh@31 -- # IFS=': '
00:03:09.649 09:26:32 -- setup/common.sh@31 -- # read -r var val _
00:03:09.649 09:26:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43634544 kB' 'MemAvailable: 45269060 kB' 'Buffers: 6816 kB' 'Cached: 9299132 kB' 'SwapCached: 180 kB' 'Active: 6728536 kB' 'Inactive: 3166168 kB' 'Active(anon): 5821056 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591348 kB' 'Mapped: 168240 kB' 'Shmem: 7557244 kB' 'KReclaimable: 582612 kB' 'Slab: 1586628 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1004016 kB' 'KernelStack: 21920 kB' 'PageTables: 8676 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10069620 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218052 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:09.649 09:26:32 -- setup/common.sh@32 -- # [xtrace condensed: keys MemTotal through HardwareCorrupted compared against AnonHugePages and skipped with continue]
00:03:09.650 09:26:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:09.650 09:26:32 -- setup/common.sh@33 -- # echo 0
00:03:09.650 09:26:32 -- setup/common.sh@33 -- # return 0
00:03:09.650 09:26:32 -- setup/hugepages.sh@97 -- # anon=0
setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.650 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.650 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.650 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.650 09:26:32 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.650 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.650 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.650 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.650 09:26:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:09.650 09:26:32 -- setup/common.sh@33 -- # echo 0 00:03:09.650 09:26:32 -- setup/common.sh@33 -- # return 0 00:03:09.650 09:26:32 -- setup/hugepages.sh@97 -- # anon=0 00:03:09.650 09:26:32 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:09.650 09:26:32 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:09.650 09:26:32 -- setup/common.sh@18 -- # local node= 00:03:09.650 09:26:32 -- setup/common.sh@19 -- # local var val 00:03:09.650 09:26:32 -- setup/common.sh@20 -- # local mem_f mem 00:03:09.650 09:26:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:09.650 09:26:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:09.650 09:26:32 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:09.650 09:26:32 -- setup/common.sh@28 -- # mapfile -t mem 00:03:09.650 09:26:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:09.650 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.650 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.650 09:26:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43635868 kB' 'MemAvailable: 45270384 kB' 'Buffers: 6816 kB' 'Cached: 9299136 kB' 'SwapCached: 180 kB' 'Active: 6727768 kB' 'Inactive: 3166168 kB' 'Active(anon): 5820288 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 591124 kB' 'Mapped: 168144 kB' 'Shmem: 7557248 kB' 'KReclaimable: 582612 kB' 'Slab: 1586576 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003964 kB' 'KernelStack: 21904 kB' 'PageTables: 8592 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10069632 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218020 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:09.650 09:26:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.650 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.650 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.650 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.650 09:26:32 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.650 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.650 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.650 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 
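
The trace above just finished one get_meminfo() call (AnonHugePages -> 0) and set up the next one (HugePages_Surp): the function slurps the meminfo file, then walks it line by line until the requested key matches, prints the value, and returns. What each call boils down to, as a minimal standalone sketch (hedged: simplified from the setup/common.sh behavior visible in this log, not the verbatim SPDK function):

get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # Quoted RHS forces a literal match; xtrace renders it as the
        # backslash-escaped \A\n\o\n\H\u\g\e\P\a\g\e\s form seen above.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < /proc/meminfo
    return 1
}

anon=$(get_meminfo AnonHugePages)   # this run: 0
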
00:03:09.650 09:26:32 -- setup/common.sh@31-32 -- # scan for HugePages_Surp: every key from MemTotal through HugePages_Rsvd fails [[ $var == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] and hits continue
00:03:09.651 09:26:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:09.651 09:26:32 -- setup/common.sh@33 -- # echo 0
00:03:09.651 09:26:32 -- setup/common.sh@33 -- # return 0
00:03:09.651 09:26:32 -- setup/hugepages.sh@99 -- # surp=0
00:03:09.651 09:26:32 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
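
A note on the backslash noise that dominates these scans: \H\u\g\e\P\a\g\e\s\_\S\u\r\p is not a glob the script wrote, it is how bash xtrace re-quotes a literal (quoted) right-hand side of [[ == ]] so the trace line stays re-executable. A quick way to reproduce the rendering (hedged: observed on recent bash versions, output format may vary):

set -x
get=HugePages_Surp
[[ HugePages_Surp == "$get" ]] && echo matched
# xtrace prints: + [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
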
00:03:09.651 09:26:32 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:09.651 09:26:32 -- setup/common.sh@18 -- # local node=
00:03:09.651 09:26:32 -- setup/common.sh@19 -- # local var val
00:03:09.651 09:26:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:09.652 09:26:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:09.652 09:26:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:09.652 09:26:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:09.652 09:26:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:09.652 09:26:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:09.652 09:26:32 -- setup/common.sh@31 -- # IFS=': '
00:03:09.652 09:26:32 -- setup/common.sh@31 -- # read -r var val _
00:03:09.652 09:26:32 -- setup/common.sh@16 -- # printf '%s\n' (same /proc/meminfo snapshot as above, now with 'MemFree: 43635364 kB' 'MemAvailable: 45269880 kB' 'Cached: 9299148 kB' 'Active: 6727784 kB' 'Active(anon): 5820304 kB' 'Shmem: 7557260 kB' 'Committed_AS: 10069648 kB')
00:03:09.652 09:26:32 -- setup/common.sh@31-32 -- # scan for HugePages_Rsvd: every key from MemTotal through HugePages_Free fails [[ $var == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] and hits continue
00:03:09.653 09:26:32 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:09.653 09:26:32 -- setup/common.sh@33 -- # echo 0
00:03:09.653 09:26:32 -- setup/common.sh@33 -- # return 0
00:03:09.653 09:26:32 -- setup/hugepages.sh@100 -- # resv=0
00:03:09.653 09:26:32 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:09.653 nr_hugepages=1024
00:03:09.653 09:26:32 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:09.653 resv_hugepages=0
00:03:09.653 09:26:32 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:09.653 surplus_hugepages=0
00:03:09.653 09:26:32 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:09.653 anon_hugepages=0
00:03:09.653 09:26:32 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:09.653 09:26:32 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
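
The four echoes and the two arithmetic guards just traced form a small consistency check: the 1024 requested hugepages must be accounted for by the kernel's pool with no surplus, reserved, or anonymous hugepages skewing the count. Restated with this run's values (variable names follow the log):

nr_hugepages=1024   # HugePages_Total from /proc/meminfo
surp=0              # HugePages_Surp
resv=0              # HugePages_Rsvd
anon=0              # AnonHugePages

(( 1024 == nr_hugepages + surp + resv )) && echo pool-consistent   # 1024 == 1024 + 0 + 0
(( 1024 == nr_hugepages )) && echo pool-exact
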
00:03:09.653 09:26:32 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:09.653 09:26:32 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:09.653 09:26:32 -- setup/common.sh@18 -- # local node=
00:03:09.653 09:26:32 -- setup/common.sh@19 -- # local var val
00:03:09.653 09:26:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:09.653 09:26:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:09.653 09:26:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:09.653 09:26:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:09.653 09:26:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:09.653 09:26:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:09.653 09:26:32 -- setup/common.sh@16 -- # printf '%s\n' (same /proc/meminfo snapshot as above, now with 'MemFree: 43635364 kB' 'MemAvailable: 45269880 kB' 'Cached: 9299172 kB' 'Active: 6727424 kB' 'Active(anon): 5819944 kB' 'AnonPages: 590712 kB' 'Shmem: 7557284 kB' 'KernelStack: 21888 kB' 'PageTables: 8544 kB' 'Committed_AS: 10069664 kB')
00:03:09.653 09:26:32 -- setup/common.sh@31 -- # IFS=': '
00:03:09.653 09:26:32 -- setup/common.sh@31 -- # read -r var val _
00:03:09.654 09:26:32 -- setup/common.sh@31-32 -- # scan for HugePages_Total: every key from MemTotal through Unaccepted fails [[ $var == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] and hits continue
00:03:09.655 09:26:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:09.655 09:26:32 -- setup/common.sh@33 -- # echo 1024
00:03:09.655 09:26:32 -- setup/common.sh@33 -- # return 0
00:03:09.655 09:26:32 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:09.655 09:26:32 -- setup/hugepages.sh@112 -- # get_nodes
00:03:09.655 09:26:32 -- setup/hugepages.sh@27 -- # local node
00:03:09.655 09:26:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:09.655 09:26:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:09.655 09:26:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:09.655 09:26:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:09.655 09:26:32 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:09.655 09:26:32 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:09.655 09:26:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:09.655 09:26:32 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:09.655 09:26:32 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:09.655 09:26:32 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:09.655 09:26:32 -- setup/common.sh@18 -- # local node=0
00:03:09.655 09:26:32 -- setup/common.sh@19 -- # local var val
00:03:09.655 09:26:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:09.655 09:26:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:09.655 09:26:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:09.655 09:26:32 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:09.655 09:26:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:09.655 09:26:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:09.655 09:26:32 -- setup/common.sh@31 -- # IFS=': '
00:03:09.655 09:26:32 -- setup/common.sh@31 -- # read -r var val _
00:03:09.655 09:26:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 24755988 kB' 'MemUsed: 7878448 kB' 'SwapCached: 80 kB' 'Active: 3963772 kB' 'Inactive: 535356 kB' 'Active(anon): 3186208 kB' 'Inactive(anon): 152 kB' 'Active(file): 777564 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4220824 kB' 'Mapped: 104164 kB' 'AnonPages: 281440 kB' 'Shmem: 2907976 kB' 'KernelStack: 9896 kB' 'PageTables: 4648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 393860 kB' 'Slab: 876808 kB' 'SReclaimable: 393860 kB' 'SUnreclaim: 482948 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
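
This is the first per-node call in the trace: get_meminfo was given a node argument, so mem_f switched from /proc/meminfo to the node's sysfs meminfo, whose lines carry a "Node 0 " prefix that the mem=(...) expansion strips. A runnable sketch of that branch (hedged: assumes a Linux host with NUMA sysfs; simplified from the common.sh lines traced above):

shopt -s extglob    # the +([0-9]) pattern below needs extended globs
node=0
mem_f=/proc/meminfo
if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo
fi
mapfile -t mem < "$mem_f"
# Per-node sysfs lines look like "Node 0 HugePages_Surp: 0"; stripping the
# "Node <n> " prefix lets the same key scan work for both sources.
mem=("${mem[@]#Node +([0-9]) }")
printf '%s\n' "${mem[@]:0:3}"
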
00:03:09.655 09:26:32 -- setup/common.sh@31-32 -- # scan node0 for HugePages_Surp: every key from MemTotal through HugePages_Free fails [[ $var == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] and hits continue
00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:09.656 09:26:32 -- setup/common.sh@33 -- # echo 0
00:03:09.656 09:26:32 -- setup/common.sh@33 -- # return 0
00:03:09.656 09:26:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:09.656 09:26:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:09.656 09:26:32 -- setup/hugepages.sh@116 -- # ((
nodes_test[node] += resv )) 00:03:09.656 09:26:32 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:09.656 09:26:32 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:09.656 09:26:32 -- setup/common.sh@18 -- # local node=1 00:03:09.656 09:26:32 -- setup/common.sh@19 -- # local var val 00:03:09.656 09:26:32 -- setup/common.sh@20 -- # local mem_f mem 00:03:09.656 09:26:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:09.656 09:26:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:09.656 09:26:32 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:09.656 09:26:32 -- setup/common.sh@28 -- # mapfile -t mem 00:03:09.656 09:26:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 18879988 kB' 'MemUsed: 8769372 kB' 'SwapCached: 100 kB' 'Active: 2764084 kB' 'Inactive: 2630812 kB' 'Active(anon): 2634168 kB' 'Inactive(anon): 2324792 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5085364 kB' 'Mapped: 63980 kB' 'AnonPages: 309676 kB' 'Shmem: 4649328 kB' 'KernelStack: 12008 kB' 'PageTables: 3944 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 188752 kB' 'Slab: 709768 kB' 'SReclaimable: 188752 kB' 'SUnreclaim: 521016 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 
00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.656 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.656 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.657 09:26:32 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.657 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.657 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.657 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.657 09:26:32 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.657 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.657 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.657 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.657 09:26:32 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.657 09:26:32 -- setup/common.sh@32 -- # continue 
00:03:09.657 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.657 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.657 09:26:32 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.657 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.657 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.657 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.657 09:26:32 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.657 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.657 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.657 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.657 09:26:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.657 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.657 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.657 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.657 09:26:32 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.657 09:26:32 -- setup/common.sh@32 -- # continue 00:03:09.657 09:26:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:09.657 09:26:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:09.657 09:26:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:09.657 09:26:32 -- setup/common.sh@33 -- # echo 0 00:03:09.657 09:26:32 -- setup/common.sh@33 -- # return 0 00:03:09.657 09:26:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:09.657 09:26:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:09.657 09:26:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:09.657 09:26:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:09.657 09:26:32 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:09.657 node0=512 expecting 512 00:03:09.657 09:26:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:09.657 09:26:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:09.657 09:26:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:09.657 09:26:32 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:09.657 node1=512 expecting 512 00:03:09.657 09:26:32 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:09.657 00:03:09.657 real 0m3.430s 00:03:09.657 user 0m1.265s 00:03:09.657 sys 0m2.193s 00:03:09.657 09:26:32 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:09.657 09:26:32 -- common/autotest_common.sh@10 -- # set +x 00:03:09.657 ************************************ 00:03:09.657 END TEST even_2G_alloc 00:03:09.657 ************************************ 00:03:09.657 09:26:32 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:03:09.657 09:26:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:09.657 09:26:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:09.657 09:26:32 -- common/autotest_common.sh@10 -- # set +x 00:03:09.657 ************************************ 00:03:09.657 START TEST odd_alloc 00:03:09.657 ************************************ 00:03:09.657 09:26:32 -- common/autotest_common.sh@1114 -- # odd_alloc 00:03:09.657 09:26:32 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:03:09.657 09:26:32 -- setup/hugepages.sh@49 -- # local size=2098176 00:03:09.657 09:26:32 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:09.657 09:26:32 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 
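All of the meminfo reads in this test funnel through the same setup/common.sh helper exercised above: pick /proc/meminfo or the per-node file, strip the "Node <N> " prefix, then walk "key: value" pairs until the requested key matches. A minimal self-contained sketch of that pattern, reconstructed from the xtrace rather than copied from the script (the loop-over-array form and the herestring are assumptions):

#!/usr/bin/env bash
# Sketch of the get_meminfo pattern visible in the trace above.
# Reconstructed from xtrace output; not the verbatim setup/common.sh source.
shopt -s extglob    # needed for the +([0-9]) pattern below

get_meminfo() {
	local get=$1 node=${2:-}
	local mem_f=/proc/meminfo
	local mem line var val _

	# With a node argument, read the per-node file instead; with no node,
	# the node$node path does not exist and /proc/meminfo is kept.
	if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi
	mapfile -t mem <"$mem_f"
	# Per-node files prefix every line with "Node <N> "; strip that off.
	mem=("${mem[@]#Node +([0-9]) }")

	for line in "${mem[@]}"; do
		# Split "HugePages_Total:     512" into key and value.
		IFS=': ' read -r var val _ <<<"$line"
		if [[ $var == "$get" ]]; then
			echo "$val"
			return 0
		fi
	done
	return 1
}

get_meminfo HugePages_Total    # whole-system count, e.g. 1024 on this box
get_meminfo HugePages_Surp 0   # per-node lookup, e.g. 0 for node0

The long escaped strings in the trace (\H\u\g\e\P\a\g\e\s\_\S\u\r\p and friends) are just how xtrace renders the quoted right-hand side of that [[ $var == "$get" ]] comparison, one escaped character at a time.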
00:03:09.657 09:26:32 -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:03:09.657 09:26:32 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:09.657 09:26:32 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:09.657 09:26:32 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:09.657 09:26:32 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:03:09.657 09:26:32 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:09.657 09:26:32 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:09.657 09:26:32 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:09.657 09:26:32 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:09.657 09:26:32 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:09.657 09:26:32 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:09.657 09:26:32 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:09.657 09:26:32 -- setup/hugepages.sh@83 -- # : 513 00:03:09.657 09:26:32 -- setup/hugepages.sh@84 -- # : 1 00:03:09.657 09:26:32 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:09.657 09:26:32 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:03:09.657 09:26:32 -- setup/hugepages.sh@83 -- # : 0 00:03:09.657 09:26:32 -- setup/hugepages.sh@84 -- # : 0 00:03:09.657 09:26:32 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:09.657 09:26:32 -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:03:09.657 09:26:32 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:03:09.657 09:26:32 -- setup/hugepages.sh@160 -- # setup output 00:03:09.657 09:26:32 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:09.657 09:26:32 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:13.019 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:13.019 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:13.019 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:13.019 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:13.019 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:13.019 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:13.019 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:13.019 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:13.019 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:13.019 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:13.019 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:13.019 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:13.019 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:13.019 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:13.019 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:13.019 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:13.019 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:13.019 09:26:35 -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:03:13.019 09:26:35 -- setup/hugepages.sh@89 -- # local node 00:03:13.019 09:26:35 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:13.019 09:26:35 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:13.019 09:26:35 -- setup/hugepages.sh@92 -- # local surp 00:03:13.019 09:26:35 -- setup/hugepages.sh@93 -- # local resv 00:03:13.019 09:26:35 -- setup/hugepages.sh@94 -- # local anon 00:03:13.019 09:26:35 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:13.019 09:26:35 -- setup/hugepages.sh@97 -- # 
get_meminfo AnonHugePages
00:03:13.019 09:26:35 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:13.019 09:26:35 -- setup/common.sh@18 -- # local node=
00:03:13.019 09:26:35 -- setup/common.sh@19 -- # local var val
00:03:13.019 09:26:35 -- setup/common.sh@20 -- # local mem_f mem
00:03:13.019 09:26:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:13.019 09:26:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:13.019 09:26:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:13.019 09:26:35 -- setup/common.sh@28 -- # mapfile -t mem
00:03:13.019 09:26:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:13.019 09:26:35 -- setup/common.sh@31 -- # IFS=': '
00:03:13.019 09:26:35 -- setup/common.sh@31 -- # read -r var val _
00:03:13.019 09:26:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43645084 kB' 'MemAvailable: 45279600 kB' 'Buffers: 6816 kB' 'Cached: 9299256 kB' 'SwapCached: 180 kB' 'Active: 6729504 kB' 'Inactive: 3166168 kB' 'Active(anon): 5822024 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592748 kB' 'Mapped: 168180 kB' 'Shmem: 7557368 kB' 'KReclaimable: 582612 kB' 'Slab: 1586316 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003704 kB' 'KernelStack: 21904 kB' 'PageTables: 8600 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10070264 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:13.020 09:26:35 -- setup/common.sh@32 -- # [every key from MemTotal through HardwareCorrupted compared against AnonHugePages; each mismatch runs continue, IFS=': ', read -r var val _]
00:03:13.020 09:26:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:13.020 09:26:35 -- setup/common.sh@33 -- # echo 0
00:03:13.020 09:26:35 -- setup/common.sh@33 -- # return 0
00:03:13.020 09:26:35 -- setup/hugepages.sh@97 -- # anon=0
00:03:13.020 09:26:35 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:13.020 09:26:35 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:13.020 09:26:35 -- setup/common.sh@18 -- # local node=
00:03:13.020 09:26:35 -- setup/common.sh@19 -- # local var val
00:03:13.020 09:26:35 -- setup/common.sh@20 -- # local mem_f mem
00:03:13.020 09:26:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:13.020 09:26:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:13.020 09:26:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:13.021 09:26:35 -- setup/common.sh@28 -- # mapfile -t mem
00:03:13.021 09:26:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:13.021 09:26:35 -- setup/common.sh@31 -- # IFS=': '
00:03:13.021 09:26:35 -- setup/common.sh@31 -- # read -r var val _
00:03:13.021 09:26:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43649208 kB' 'MemAvailable: 45283724 kB' 'Buffers: 6816 kB' 'Cached: 9299260 kB' 'SwapCached: 180 kB' 'Active: 6729056 kB' 'Inactive: 3166168 kB' 'Active(anon): 5821576 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592368 kB' 'Mapped: 167880 kB' 'Shmem: 7557372 kB' 'KReclaimable: 582612 kB' 'Slab: 1586316 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003704 kB' 'KernelStack: 21936 kB' 'PageTables: 8680 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10070288 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218020 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:13.021 09:26:35 -- setup/common.sh@32 -- # [every key from MemTotal through HugePages_Rsvd compared against HugePages_Surp; each mismatch runs continue, IFS=': ', read -r var val _]
00:03:13.022 09:26:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:13.022 09:26:35 -- setup/common.sh@33 -- # echo 0
00:03:13.022 09:26:35 -- setup/common.sh@33 -- # return 0
00:03:13.022 09:26:35 -- setup/hugepages.sh@99 -- # surp=0
00:03:13.022 09:26:35 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:13.022 09:26:35 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:13.022 09:26:35 -- setup/common.sh@18 -- # local node=
00:03:13.022 09:26:35 -- setup/common.sh@19 -- # local var val
00:03:13.022 09:26:35 -- setup/common.sh@20 -- # local mem_f mem
00:03:13.022 09:26:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:13.022 09:26:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:13.022 09:26:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:13.022 09:26:35 -- setup/common.sh@28 -- # mapfile -t mem
00:03:13.022 09:26:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:13.022 09:26:35 -- setup/common.sh@31 -- # IFS=': '
00:03:13.022 09:26:35 -- setup/common.sh@31 -- # read -r var val _
00:03:13.022 09:26:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43649976 kB' 'MemAvailable: 45284492 kB' 'Buffers: 6816 kB' 'Cached: 9299272 kB' 'SwapCached: 180 kB' 'Active: 6729236 kB' 'Inactive: 3166168 kB' 'Active(anon): 5821756 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592488 kB' 'Mapped: 168148 kB' 'Shmem: 7557384 kB' 'KReclaimable: 582612 kB' 'Slab: 1586436 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003824 kB' 'KernelStack: 21920 kB' 'PageTables: 8636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10070300 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218020 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:13.023 09:26:35 -- setup/common.sh@32 -- # [keys MemTotal through VmallocTotal compared against HugePages_Rsvd; each mismatch runs continue, IFS=': ', read -r var val _]
setup/common.sh@31 -- # read -r var val _ 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # continue 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # continue 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # continue 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # continue 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # continue 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # continue 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # continue 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # continue 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # continue 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # continue 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # continue 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # continue 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.023 
09:26:35 -- setup/common.sh@32 -- # continue 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # continue 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.023 09:26:35 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:13.023 09:26:35 -- setup/common.sh@33 -- # echo 0 00:03:13.023 09:26:35 -- setup/common.sh@33 -- # return 0 00:03:13.023 09:26:35 -- setup/hugepages.sh@100 -- # resv=0 00:03:13.023 09:26:35 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:03:13.023 nr_hugepages=1025 00:03:13.023 09:26:35 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:13.023 resv_hugepages=0 00:03:13.023 09:26:35 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:13.023 surplus_hugepages=0 00:03:13.023 09:26:35 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:13.023 anon_hugepages=0 00:03:13.023 09:26:35 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:13.023 09:26:35 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:03:13.023 09:26:35 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:13.023 09:26:35 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:13.023 09:26:35 -- setup/common.sh@18 -- # local node= 00:03:13.023 09:26:35 -- setup/common.sh@19 -- # local var val 00:03:13.023 09:26:35 -- setup/common.sh@20 -- # local mem_f mem 00:03:13.023 09:26:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:13.023 09:26:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:13.023 09:26:35 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:13.023 09:26:35 -- setup/common.sh@28 -- # mapfile -t mem 00:03:13.023 09:26:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:13.023 09:26:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:13.024 09:26:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43649976 kB' 'MemAvailable: 45284492 kB' 'Buffers: 6816 kB' 'Cached: 9299288 kB' 'SwapCached: 180 kB' 'Active: 6729128 kB' 'Inactive: 3166168 kB' 'Active(anon): 5821648 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592404 kB' 'Mapped: 168148 kB' 'Shmem: 7557400 kB' 'KReclaimable: 582612 kB' 'Slab: 1586436 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003824 kB' 'KernelStack: 21920 kB' 'PageTables: 8636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10070316 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218036 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:13.024 09:26:35 -- 
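For readers following the trace: get_meminfo is the test/setup/common.sh helper being xtraced above. Reconstructed from the trace alone — so treat this as a sketch rather than the exact SPDK source — it snapshots /proc/meminfo (or a node's sysfs copy), strips any 'Node N' prefix, and walks the keys until the requested field matches:

  #!/usr/bin/env bash
  shopt -s extglob  # the +([0-9]) pattern below needs extended globbing

  get_meminfo() {
      local get=$1 node=$2
      local var val _ line mem
      local mem_f=/proc/meminfo
      # With a node argument, read the per-node sysfs copy instead.
      if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      mapfile -t mem < "$mem_f"
      mem=("${mem[@]#Node +([0-9]) }")  # sysfs lines carry a "Node N " prefix
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          # Echo the value of the requested key (kB for sizes, a bare
          # count for the HugePages_* fields) and stop at the first hit.
          [[ $var == "$get" ]] && { echo "$val"; return 0; }
      done
      return 1
  }

On this box, get_meminfo HugePages_Rsvd prints 0, which is exactly what hugepages.sh@100 stores in resv above.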
00:03:13.023 09:26:35 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:13.023 09:26:35 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:13.023 09:26:35 -- setup/common.sh@18 -- # local node=
00:03:13.023 09:26:35 -- setup/common.sh@19 -- # local var val
00:03:13.023 09:26:35 -- setup/common.sh@20 -- # local mem_f mem
00:03:13.023 09:26:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:13.023 09:26:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:13.023 09:26:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:13.023 09:26:35 -- setup/common.sh@28 -- # mapfile -t mem
00:03:13.023 09:26:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:13.023 09:26:35 -- setup/common.sh@31 -- # IFS=': '
00:03:13.023 09:26:35 -- setup/common.sh@31 -- # read -r var val _
00:03:13.024 09:26:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43649976 kB' 'MemAvailable: 45284492 kB' 'Buffers: 6816 kB' 'Cached: 9299288 kB' 'SwapCached: 180 kB' 'Active: 6729128 kB' 'Inactive: 3166168 kB' 'Active(anon): 5821648 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592404 kB' 'Mapped: 168148 kB' 'Shmem: 7557400 kB' 'KReclaimable: 582612 kB' 'Slab: 1586436 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003824 kB' 'KernelStack: 21920 kB' 'PageTables: 8636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480900 kB' 'Committed_AS: 10070316 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218036 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:13.024 09:26:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
[... the scan walks every key of the snapshot above; nothing matches until HugePages_Total ...]
00:03:13.287 09:26:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:13.287 09:26:35 -- setup/common.sh@33 -- # echo 1025
00:03:13.287 09:26:35 -- setup/common.sh@33 -- # return 0
00:03:13.287 09:26:35 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
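The assertion at hugepages.sh@110 that just passed is plain pool accounting: the kernel's HugePages_Total must equal the pages the test requested plus any surplus and reserved pages. A stand-alone restatement (check_hugepage_pool is a hypothetical wrapper, reusing the get_meminfo sketch above):

  check_hugepage_pool() {
      local want=$1 surp resv total
      surp=$(get_meminfo HugePages_Surp)    # 0 in this run
      resv=$(get_meminfo HugePages_Rsvd)    # 0 in this run
      total=$(get_meminfo HugePages_Total)  # 1025 in this run
      # 1025 == 1025 + 0 + 0 is the comparison the trace shows succeeding.
      (( total == want + surp + resv ))
  }

  check_hugepage_pool 1025 && echo 'pool accounted for'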
00:03:13.287 09:26:35 -- setup/hugepages.sh@112 -- # get_nodes
00:03:13.287 09:26:35 -- setup/hugepages.sh@27 -- # local node
00:03:13.287 09:26:35 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:13.287 09:26:35 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:13.287 09:26:35 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:13.287 09:26:35 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:03:13.287 09:26:35 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:13.287 09:26:35 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:13.287 09:26:35 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:13.287 09:26:35 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:13.287 09:26:35 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:13.287 09:26:35 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:13.287 09:26:35 -- setup/common.sh@18 -- # local node=0
00:03:13.287 09:26:35 -- setup/common.sh@19 -- # local var val
00:03:13.287 09:26:35 -- setup/common.sh@20 -- # local mem_f mem
00:03:13.287 09:26:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:13.287 09:26:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:13.287 09:26:35 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:13.287 09:26:35 -- setup/common.sh@28 -- # mapfile -t mem
00:03:13.287 09:26:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:13.287 09:26:35 -- setup/common.sh@31 -- # IFS=': '
00:03:13.287 09:26:35 -- setup/common.sh@31 -- # read -r var val _
00:03:13.287 09:26:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 24764952 kB' 'MemUsed: 7869484 kB' 'SwapCached: 80 kB' 'Active: 3964156 kB' 'Inactive: 535356 kB' 'Active(anon): 3186592 kB' 'Inactive(anon): 152 kB' 'Active(file): 777564 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4220848 kB' 'Mapped: 104168 kB' 'AnonPages: 281756 kB' 'Shmem: 2908000 kB' 'KernelStack: 9912 kB' 'PageTables: 4652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 393860 kB' 'Slab: 876832 kB' 'SReclaimable: 393860 kB' 'SUnreclaim: 482972 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:13.287 09:26:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
[... the scan walks every key of the node0 snapshot; nothing matches until HugePages_Surp ...]
00:03:13.288 09:26:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:13.288 09:26:35 -- setup/common.sh@33 -- # echo 0
00:03:13.288 09:26:35 -- setup/common.sh@33 -- # return 0
00:03:13.288 09:26:35 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
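Note how the node argument changes the helper's behavior: get_meminfo HugePages_Surp 0 (and then 1, below) reads /sys/devices/system/node/nodeN/meminfo instead of /proc/meminfo, per common.sh@24 above. The per-node bookkeeping around those calls, paraphrased (nodes_test holding the expected per-node split is an assumption carried over from earlier in the test; resv and both surpluses are 0 in this run):

  nodes_test=(513 512)  # assumed: expected per-node counts set earlier by the test
  resv=0
  for node in "${!nodes_test[@]}"; do
      (( nodes_test[node] += resv ))                # hugepages.sh@116
      surp=$(get_meminfo HugePages_Surp "$node")    # per-node sysfs read
      (( nodes_test[node] += surp ))                # hugepages.sh@117
  done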
00:03:13.288 09:26:35 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:13.288 09:26:35 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:13.288 09:26:35 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:13.288 09:26:35 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:13.288 09:26:35 -- setup/common.sh@18 -- # local node=1
00:03:13.288 09:26:35 -- setup/common.sh@19 -- # local var val
00:03:13.288 09:26:35 -- setup/common.sh@20 -- # local mem_f mem
00:03:13.288 09:26:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:13.288 09:26:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:13.288 09:26:35 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:13.288 09:26:35 -- setup/common.sh@28 -- # mapfile -t mem
00:03:13.288 09:26:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:13.288 09:26:35 -- setup/common.sh@31 -- # IFS=': '
00:03:13.288 09:26:35 -- setup/common.sh@31 -- # read -r var val _
00:03:13.288 09:26:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 18885840 kB' 'MemUsed: 8763520 kB' 'SwapCached: 100 kB' 'Active: 2765356 kB' 'Inactive: 2630812 kB' 'Active(anon): 2635440 kB' 'Inactive(anon): 2324792 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5085460 kB' 'Mapped: 63980 kB' 'AnonPages: 310988 kB' 'Shmem: 4649424 kB' 'KernelStack: 12024 kB' 'PageTables: 4032 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 188752 kB' 'Slab: 709604 kB' 'SReclaimable: 188752 kB' 'SUnreclaim: 520852 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
00:03:13.288 09:26:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
[... the scan walks every key of the node1 snapshot; nothing matches until HugePages_Surp ...]
00:03:13.289 09:26:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:13.289 09:26:35 -- setup/common.sh@33 -- # echo 0
00:03:13.289 09:26:35 -- setup/common.sh@33 -- # return 0
00:03:13.289 09:26:35 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:13.289 09:26:35 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:13.289 09:26:35 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:13.289 09:26:35 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:13.289 09:26:35 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
00:03:13.289 node0=512 expecting 513
00:03:13.289 09:26:35 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:13.289 09:26:35 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:13.289 09:26:35 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:13.289 09:26:35 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
00:03:13.289 node1=513 expecting 512
00:03:13.289 09:26:35 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:03:13.289
00:03:13.289 real 0m3.505s
00:03:13.289 user 0m1.307s
00:03:13.289 sys 0m2.259s
00:03:13.289 09:26:35 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:13.289 09:26:35 -- common/autotest_common.sh@10 -- # set +x
00:03:13.289 ************************************
00:03:13.289 END TEST odd_alloc
00:03:13.289 ************************************
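What odd_alloc just verified: 1025 hugepages (an odd count) spread across two NUMA nodes necessarily split 512/513, but which node receives the extra page is the kernel's choice. That is why the test builds the sorted_t/sorted_s sets from the counts and compares '512 513' against '512 513' rather than matching node for node — the trace shows node0 holding 512 where 513 was expected, and vice versa on node1, yet the test still passes. The arithmetic, spelled out:

  nr=1025 nodes=2
  base=$(( nr / nodes ))   # 512 pages on one node
  rest=$(( nr % nodes ))   # 1 leftover page lands on some node
  echo "split: $base and $(( base + rest ))"   # 512 and 513, as logged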
00:03:13.289 09:26:35 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:03:13.290 09:26:35 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:13.290 09:26:35 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:13.290 09:26:35 -- common/autotest_common.sh@10 -- # set +x
00:03:13.290 ************************************
00:03:13.290 START TEST custom_alloc
00:03:13.290 ************************************
00:03:13.290 09:26:35 -- common/autotest_common.sh@1114 -- # custom_alloc
00:03:13.290 09:26:35 -- setup/hugepages.sh@167 -- # local IFS=,
00:03:13.290 09:26:35 -- setup/hugepages.sh@169 -- # local node
00:03:13.290 09:26:35 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:03:13.290 09:26:35 -- setup/hugepages.sh@170 -- # local nodes_hp
00:03:13.290 09:26:35 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:03:13.290 09:26:35 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:03:13.290 09:26:35 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:13.290 09:26:35 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:13.290 09:26:35 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:13.290 09:26:35 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:13.290 09:26:35 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:13.290 09:26:35 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:13.290 09:26:35 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:13.290 09:26:35 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:13.290 09:26:35 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:13.290 09:26:35 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:13.290 09:26:35 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:13.290 09:26:35 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:13.290 09:26:35 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:13.290 09:26:35 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:13.290 09:26:35 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:13.290 09:26:35 -- setup/hugepages.sh@83 -- # : 256
00:03:13.290 09:26:35 -- setup/hugepages.sh@84 -- # : 1
00:03:13.290 09:26:35 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:13.290 09:26:35 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:03:13.290 09:26:35 -- setup/hugepages.sh@83 -- # : 0
00:03:13.290 09:26:35 -- setup/hugepages.sh@84 -- # : 0
00:03:13.290 09:26:35 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:13.290 09:26:35 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
00:03:13.290 09:26:35 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
00:03:13.290 09:26:35 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:03:13.290 09:26:35 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:13.290 09:26:35 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:13.290 09:26:35 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:13.290 09:26:35 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:13.290 09:26:35 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:13.290 09:26:35 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:13.290 09:26:35 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:13.290 09:26:35 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:13.290 09:26:35 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:13.290 09:26:35 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:13.290 09:26:35 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:13.290 09:26:35 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:13.290 09:26:35 -- setup/hugepages.sh@74 -- # (( 1 > 0 ))
00:03:13.290 09:26:35 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:13.290 09:26:35 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:03:13.290 09:26:35 -- setup/hugepages.sh@78 -- # return 0
00:03:13.290 09:26:35 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:03:13.290 09:26:35 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:03:13.290 09:26:35 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:03:13.290 09:26:35 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:03:13.290 09:26:35 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:03:13.290 09:26:35 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:03:13.290 09:26:35 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:03:13.290 09:26:35 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node
00:03:13.290 09:26:35 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:13.290 09:26:35 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:13.290 09:26:35 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:13.290 09:26:35 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:13.290 09:26:35 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:13.290 09:26:35 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:13.290 09:26:35 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:13.290 09:26:35 -- setup/hugepages.sh@74 -- # (( 2 > 0 ))
00:03:13.290 09:26:35 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:13.290 09:26:35 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:03:13.290 09:26:35 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:03:13.290 09:26:35 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024
00:03:13.290 09:26:35 -- setup/hugepages.sh@78 -- # return 0
00:03:13.290 09:26:35 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
00:03:13.290 09:26:35 -- setup/hugepages.sh@187 -- # setup output
00:03:13.290 09:26:35 -- setup/common.sh@9 -- # [[ output == output ]]
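custom_alloc's setup, in short: get_test_nr_hugepages 1048576 yields nr_hugepages=512 and 2097152 yields 1024, consistent with the arguments being kB and the 2048 kB hugepage size the snapshots report (a 1 GiB and a 2 GiB request). The HUGENODE string handed to setup.sh is then just the per-node page counts joined with commas. Reconstructed from the trace (the kB units are an inference from the arithmetic, not confirmed by the log itself):

  default_hugepages=2048                   # kB, the Hugepagesize this box reports
  nr0=$(( 1048576 / default_hugepages ))   # 1 GiB request -> 512 pages, nodes_hp[0]
  nr1=$(( 2097152 / default_hugepages ))   # 2 GiB request -> 1024 pages, nodes_hp[1]
  HUGENODE="nodes_hp[0]=$nr0,nodes_hp[1]=$nr1"
  echo "$HUGENODE"   # nodes_hp[0]=512,nodes_hp[1]=1024 — matching the trace
  # 512 + 1024 = 1536 pages total, the nr_hugepages=1536 logged once setup.sh runs.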
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:16.588 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:16.588 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:16.588 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:16.588 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:16.588 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:16.588 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:16.588 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:16.588 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:16.588 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:16.588 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:16.588 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:16.588 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:16.588 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:16.588 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:16.588 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:16.588 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:16.588 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:16.588 09:26:38 -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:03:16.588 09:26:38 -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:03:16.588 09:26:38 -- setup/hugepages.sh@89 -- # local node 00:03:16.588 09:26:38 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:16.588 09:26:38 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:16.588 09:26:38 -- setup/hugepages.sh@92 -- # local surp 00:03:16.588 09:26:38 -- setup/hugepages.sh@93 -- # local resv 00:03:16.588 09:26:38 -- setup/hugepages.sh@94 -- # local anon 00:03:16.588 09:26:38 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:16.588 09:26:38 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:16.588 09:26:38 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:16.588 09:26:38 -- setup/common.sh@18 -- # local node= 00:03:16.588 09:26:38 -- setup/common.sh@19 -- # local var val 00:03:16.588 09:26:38 -- setup/common.sh@20 -- # local mem_f mem 00:03:16.588 09:26:38 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:16.588 09:26:38 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:16.588 09:26:38 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:16.588 09:26:38 -- setup/common.sh@28 -- # mapfile -t mem 00:03:16.588 09:26:38 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:16.588 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.588 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42635704 kB' 'MemAvailable: 44270220 kB' 'Buffers: 6816 kB' 'Cached: 9299380 kB' 'SwapCached: 180 kB' 'Active: 6729600 kB' 'Inactive: 3166168 kB' 'Active(anon): 5822120 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592684 kB' 'Mapped: 168252 kB' 'Shmem: 7557492 kB' 'KReclaimable: 582612 kB' 'Slab: 1585384 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1002772 kB' 'KernelStack: 21920 kB' 'PageTables: 8608 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 
'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10070544 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218036 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- 
setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:38 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:38 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:39 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:39 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:39 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:39 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:39 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.589 09:26:39 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.589 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.589 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 
-- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:16.590 09:26:39 -- setup/common.sh@33 -- # echo 0 00:03:16.590 09:26:39 -- setup/common.sh@33 -- # return 0 00:03:16.590 09:26:39 -- setup/hugepages.sh@97 -- # anon=0 00:03:16.590 09:26:39 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:16.590 09:26:39 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:16.590 09:26:39 -- setup/common.sh@18 -- # local node= 00:03:16.590 09:26:39 -- setup/common.sh@19 -- # local var val 00:03:16.590 09:26:39 -- setup/common.sh@20 -- # local mem_f mem 00:03:16.590 09:26:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:16.590 09:26:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:16.590 09:26:39 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:16.590 09:26:39 -- setup/common.sh@28 -- # mapfile -t mem 00:03:16.590 09:26:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42635848 kB' 'MemAvailable: 44270364 kB' 'Buffers: 6816 kB' 'Cached: 9299388 kB' 'SwapCached: 180 kB' 'Active: 6729256 kB' 'Inactive: 3166168 kB' 'Active(anon): 5821776 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592332 kB' 'Mapped: 168152 kB' 'Shmem: 7557500 kB' 'KReclaimable: 582612 kB' 'Slab: 1585328 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1002716 kB' 'KernelStack: 21872 kB' 'PageTables: 8460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10070560 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218004 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:16.590 09:26:39 -- 
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- 
setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.590 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.590 09:26:39 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 
09:26:39 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 
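The repeated "[[ <key> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]" / "continue" pairs traced above and below are get_meminfo (setup/common.sh) scanning the meminfo file: it slurps every line with mapfile, strips any "Node N " prefix, then tests key after key until it reaches the requested field, so each skipped key costs one xtrace line. A minimal runnable sketch of that lookup, reconstructed from the traced statements (the sed stands in for the extglob prefix strip the real helper uses, so treat this as illustrative rather than the shipped code):

    get_meminfo() {
        local get=$1 node=${2:-} var val _
        local mem_f=/proc/meminfo
        # With a node argument, read the per-NUMA-node counters from sysfs;
        # those lines carry a "Node N " prefix that must be stripped first.
        [[ -n $node ]] && mem_f=/sys/devices/system/node/node$node/meminfo
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # one traced test per skipped key
            echo "$val"
            return 0
        done < <(sed 's/^Node [0-9]* //' "$mem_f")
        return 1
    }

Called as 'get_meminfo HugePages_Surp' for the system-wide count (0 in this run) or 'get_meminfo HugePages_Surp 0' for NUMA node 0, as happens further down.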
00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.591 09:26:39 -- setup/common.sh@33 -- # echo 0 00:03:16.591 09:26:39 -- setup/common.sh@33 -- # return 0 00:03:16.591 09:26:39 -- setup/hugepages.sh@99 -- # surp=0 00:03:16.591 09:26:39 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:16.591 09:26:39 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:16.591 09:26:39 -- setup/common.sh@18 -- # local node= 00:03:16.591 09:26:39 -- setup/common.sh@19 -- # local var val 00:03:16.591 09:26:39 -- setup/common.sh@20 -- # local mem_f mem 00:03:16.591 09:26:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:16.591 09:26:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:16.591 09:26:39 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:16.591 09:26:39 -- setup/common.sh@28 -- # mapfile -t mem 00:03:16.591 09:26:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42636224 kB' 'MemAvailable: 44270740 kB' 'Buffers: 6816 kB' 'Cached: 9299404 kB' 'SwapCached: 180 kB' 'Active: 6729760 kB' 'Inactive: 3166168 kB' 'Active(anon): 5822280 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592816 kB' 'Mapped: 168152 kB' 'Shmem: 7557516 kB' 'KReclaimable: 582612 kB' 'Slab: 1585328 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1002716 kB' 'KernelStack: 21904 kB' 'PageTables: 8596 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10071080 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218004 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.591 09:26:39 -- 
setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.591 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.591 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- 
setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 
09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
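Once the HugePages_Rsvd scan below returns 0, verify_nr_hugepages holds all three correction terms (anon=0, surp=0, resv=0) and compares them against the target custom_alloc requested via HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024', that is, 512 pages on node0 plus 1024 on node1 for nr_hugepages=1536; the traced test is "(( 1536 == nr_hugepages + surp + resv ))". A self-contained sketch of that accounting identity, assuming only the standard hugepage fields of /proc/meminfo:

    req=1536   # nodes_hp[0]=512 + nodes_hp[1]=1024 from the custom_alloc setup
    read -r total surp resv < <(awk '
        /^HugePages_Total:/ {t = $2}
        /^HugePages_Surp:/  {s = $2}
        /^HugePages_Rsvd:/  {r = $2}
        END {print t+0, s+0, r+0}' /proc/meminfo)
    # With surp=resv=0 this reduces to HugePages_Total matching the request.
    if (( total == req + surp + resv )); then
        echo "nr_hugepages=$req verified"
    else
        echo "mismatch: total=$total surp=$surp resv=$resv" >&2
    fi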
00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.592 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.592 09:26:39 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:16.592 09:26:39 -- setup/common.sh@33 -- # echo 0 00:03:16.592 09:26:39 -- setup/common.sh@33 -- # return 0 00:03:16.592 09:26:39 -- setup/hugepages.sh@100 -- # resv=0 00:03:16.593 09:26:39 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:03:16.593 nr_hugepages=1536 00:03:16.593 09:26:39 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:16.593 resv_hugepages=0 00:03:16.593 09:26:39 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:16.593 surplus_hugepages=0 00:03:16.593 09:26:39 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:16.593 anon_hugepages=0 00:03:16.593 09:26:39 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:16.593 09:26:39 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:03:16.593 09:26:39 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:16.593 09:26:39 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:16.593 09:26:39 -- setup/common.sh@18 -- # local node= 00:03:16.593 
09:26:39 -- setup/common.sh@19 -- # local var val 00:03:16.593 09:26:39 -- setup/common.sh@20 -- # local mem_f mem 00:03:16.593 09:26:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:16.593 09:26:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:16.593 09:26:39 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:16.593 09:26:39 -- setup/common.sh@28 -- # mapfile -t mem 00:03:16.593 09:26:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 42635476 kB' 'MemAvailable: 44269992 kB' 'Buffers: 6816 kB' 'Cached: 9299408 kB' 'SwapCached: 180 kB' 'Active: 6729456 kB' 'Inactive: 3166168 kB' 'Active(anon): 5821976 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592524 kB' 'Mapped: 168152 kB' 'Shmem: 7557520 kB' 'KReclaimable: 582612 kB' 'Slab: 1585328 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1002716 kB' 'KernelStack: 21904 kB' 'PageTables: 8576 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957636 kB' 'Committed_AS: 10071096 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218004 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ Dirty == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.593 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.593 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 
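After the system-wide total checks out, the trace moves on to the per-node phase: get_nodes enumerates /sys/devices/system/node/node+([0-9]), records the expected split (nodes_sys[0]=512, nodes_sys[1]=1024), and get_meminfo is re-run against each node's own meminfo file. A rough sketch of that per-node readback, assuming only the sysfs layout visible in the trace:

    for node_dir in /sys/devices/system/node/node[0-9]*; do
        node=${node_dir##*node}   # same suffix strip the trace shows
        # Per-node lines read "Node 0 HugePages_Total:   512", so the
        # value is the fourth whitespace-separated field.
        total=$(awk '/HugePages_Total:/ {print $4}' "$node_dir/meminfo")
        echo "node$node: HugePages_Total=$total"
    done

On this box that should print 512 for node0 and 1024 for node1, matching the HUGENODE request.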
09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ FileHugePages == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:16.594 09:26:39 -- setup/common.sh@33 -- # echo 1536 00:03:16.594 09:26:39 -- setup/common.sh@33 -- # return 0 00:03:16.594 09:26:39 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:16.594 09:26:39 -- setup/hugepages.sh@112 -- # get_nodes 00:03:16.594 09:26:39 -- setup/hugepages.sh@27 -- # local node 00:03:16.594 09:26:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:16.594 09:26:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:16.594 09:26:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:16.594 09:26:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:16.594 09:26:39 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:16.594 09:26:39 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:16.594 09:26:39 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:16.594 09:26:39 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:16.594 09:26:39 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:16.594 09:26:39 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:16.594 09:26:39 -- setup/common.sh@18 -- # local node=0 00:03:16.594 09:26:39 -- setup/common.sh@19 -- # local var val 00:03:16.594 09:26:39 -- setup/common.sh@20 -- # local mem_f mem 00:03:16.594 09:26:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:16.594 09:26:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:16.594 09:26:39 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:16.594 09:26:39 -- setup/common.sh@28 -- # mapfile -t mem 00:03:16.594 09:26:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 24760968 kB' 'MemUsed: 7873468 kB' 'SwapCached: 80 kB' 'Active: 3965120 kB' 'Inactive: 535356 kB' 'Active(anon): 
3187556 kB' 'Inactive(anon): 152 kB' 'Active(file): 777564 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4220912 kB' 'Mapped: 104172 kB' 'AnonPages: 282796 kB' 'Shmem: 2908064 kB' 'KernelStack: 9944 kB' 'PageTables: 4740 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 393860 kB' 'Slab: 876116 kB' 'SReclaimable: 393860 kB' 'SUnreclaim: 482256 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.594 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.594 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ Unevictable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # 
read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- 
setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@33 -- # echo 0 00:03:16.595 09:26:39 -- setup/common.sh@33 -- # return 0 00:03:16.595 09:26:39 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:16.595 09:26:39 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:16.595 09:26:39 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:16.595 09:26:39 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:16.595 09:26:39 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:16.595 09:26:39 -- setup/common.sh@18 -- # local node=1 00:03:16.595 09:26:39 -- setup/common.sh@19 -- # local var val 00:03:16.595 09:26:39 -- setup/common.sh@20 -- # local mem_f mem 00:03:16.595 09:26:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:16.595 09:26:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:16.595 09:26:39 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:16.595 09:26:39 -- setup/common.sh@28 -- # mapfile -t mem 00:03:16.595 09:26:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:16.595 09:26:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 17874824 kB' 'MemUsed: 9774536 kB' 'SwapCached: 100 kB' 'Active: 2764672 kB' 'Inactive: 2630812 kB' 'Active(anon): 2634756 kB' 'Inactive(anon): 2324792 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5085516 kB' 'Mapped: 63980 kB' 'AnonPages: 310032 kB' 'Shmem: 4649480 kB' 'KernelStack: 11960 kB' 'PageTables: 3836 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 188752 kB' 'Slab: 709212 kB' 'SReclaimable: 188752 kB' 'SUnreclaim: 520460 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # continue 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': ' 00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _ 00:03:16.595 09:26:39 -- setup/common.sh@32 -- # [[ Active 
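The get_meminfo lookup traced at setup/common.sh@17-@33 above repeats for every key the hugepages tests ask for. Here is a minimal bash sketch reconstructed from the trace (not the verbatim SPDK setup/common.sh; the exact loop body and the return-on-miss behavior are assumptions): it resolves the per-node meminfo file when a node is given, strips the "Node N " prefix, and linearly scans for the requested key, which is exactly what produces the long runs of "continue" in this log.

    # Sketch of the traced lookup; reconstructed, not verbatim SPDK source.
    shopt -s extglob

    get_meminfo() {
        local get=$1 node=$2
        local var val _
        local mem_f=/proc/meminfo mem
        # Prefer the per-NUMA-node view when requested and present (common.sh@23-24).
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node rows carry a "Node N " prefix; strip it (common.sh@29).
        mem=("${mem[@]#Node +([0-9]) }")
        local line
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            # Every non-matching key is one "continue" in the trace above.
            [[ $var == "$get" ]] || continue
            echo "$val"    # e.g. 0 for HugePages_Surp on node0 (common.sh@33)
            return 0
        done
        return 1
    }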
00:03:16.595 09:26:39 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:16.595 09:26:39 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:16.595 09:26:39 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:16.595 09:26:39 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:16.595 09:26:39 -- setup/common.sh@18 -- # local node=1
00:03:16.595 09:26:39 -- setup/common.sh@19 -- # local var val
00:03:16.595 09:26:39 -- setup/common.sh@20 -- # local mem_f mem
00:03:16.595 09:26:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:16.595 09:26:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:16.595 09:26:39 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:16.595 09:26:39 -- setup/common.sh@28 -- # mapfile -t mem
00:03:16.595 09:26:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:16.595 09:26:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27649360 kB' 'MemFree: 17874824 kB' 'MemUsed: 9774536 kB' 'SwapCached: 100 kB' 'Active: 2764672 kB' 'Inactive: 2630812 kB' 'Active(anon): 2634756 kB' 'Inactive(anon): 2324792 kB' 'Active(file): 129916 kB' 'Inactive(file): 306020 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5085516 kB' 'Mapped: 63980 kB' 'AnonPages: 310032 kB' 'Shmem: 4649480 kB' 'KernelStack: 11960 kB' 'PageTables: 3836 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 188752 kB' 'Slab: 709212 kB' 'SReclaimable: 188752 kB' 'SUnreclaim: 520460 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:03:16.595 09:26:39 -- setup/common.sh@31 -- # IFS=': '
00:03:16.595 09:26:39 -- setup/common.sh@31 -- # read -r var val _
00:03:16.595 09:26:39 -- setup/common.sh@31-32 -- # loop: node1 keys MemTotal .. HugePages_Free all != HugePages_Surp, continue on each
00:03:16.596 09:26:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:16.596 09:26:39 -- setup/common.sh@33 -- # echo 0
00:03:16.596 09:26:39 -- setup/common.sh@33 -- # return 0
00:03:16.596 09:26:39 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:16.596 09:26:39 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:16.596 09:26:39 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:16.596 09:26:39 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:16.596 09:26:39 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:16.596 node0=512 expecting 512
00:03:16.596 09:26:39 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:16.596 09:26:39 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:16.596 09:26:39 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:16.596 09:26:39 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:03:16.596 node1=1024 expecting 1024
00:03:16.596 09:26:39 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:03:16.596
00:03:16.596 real 0m3.190s
00:03:16.596 user 0m1.043s
00:03:16.596 sys 0m2.074s
00:03:16.596 09:26:39 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:16.596 09:26:39 -- common/autotest_common.sh@10 -- # set +x
00:03:16.596 ************************************
00:03:16.596 END TEST custom_alloc
00:03:16.596 ************************************
00:03:16.596 09:26:39 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:03:16.596 09:26:39 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:16.596 09:26:39 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:16.596 09:26:39 -- common/autotest_common.sh@10 -- # set +x
00:03:16.596 ************************************
00:03:16.596 START TEST no_shrink_alloc
00:03:16.596 ************************************
00:03:16.596 09:26:39 -- common/autotest_common.sh@1114 -- # no_shrink_alloc
00:03:16.596 09:26:39 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:03:16.596 09:26:39 -- setup/hugepages.sh@49 -- # local size=2097152
00:03:16.596 09:26:39 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:03:16.596 09:26:39 -- setup/hugepages.sh@51 -- # shift
00:03:16.596 09:26:39 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:03:16.596 09:26:39 -- setup/hugepages.sh@52 -- # local node_ids
00:03:16.596 09:26:39 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:16.596 09:26:39 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:03:16.596 09:26:39 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:03:16.596 09:26:39 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:03:16.596 09:26:39 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:16.597 09:26:39 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:03:16.597 09:26:39 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:16.597 09:26:39 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:16.597 09:26:39 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:16.597 09:26:39 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:03:16.597 09:26:39 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:16.597 09:26:39 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:03:16.597 09:26:39 -- setup/hugepages.sh@73 -- # return 0
00:03:16.597 09:26:39 -- setup/hugepages.sh@198 -- # setup output
00:03:16.597 09:26:39 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:16.597 09:26:39 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:19.895 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:19.895 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:19.895 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:19.895 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:19.895 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:19.895 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:19.895 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:19.895 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:19.895 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:19.895 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:19.895 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:19.895 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:19.895 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:19.895 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:19.895 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:19.895 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:19.895 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
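Before the verification below, note where nr_hugepages=1024 at setup/hugepages.sh@57 above comes from. The trace only shows the inputs and the result; the division by the default hugepage size is an assumption here, with Hugepagesize: 2048 kB taken from the meminfo dumps that follow:

    # Hedged sketch of get_test_nr_hugepages' sizing arithmetic (assumed, not verbatim).
    size=2097152                                  # requested pool in kB (hugepages.sh@49)
    default_hugepages=2048                        # 'Hugepagesize: 2048 kB' per meminfo
    (( size >= default_hugepages ))               # sanity gate (hugepages.sh@55)
    nr_hugepages=$(( size / default_hugepages ))  # 2097152 / 2048 = 1024 (hugepages.sh@57)
    user_nodes=('0')                              # test pinned to NUMA node 0
    nodes_test=()
    for _no_nodes in "${user_nodes[@]}"; do
        nodes_test[_no_nodes]=$nr_hugepages       # nodes_test[0]=1024 (hugepages.sh@71)
    done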
00:03:19.895 09:26:42 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:03:19.895 09:26:42 -- setup/hugepages.sh@89 -- # local node
00:03:19.895 09:26:42 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:19.895 09:26:42 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:19.895 09:26:42 -- setup/hugepages.sh@92 -- # local surp
00:03:19.895 09:26:42 -- setup/hugepages.sh@93 -- # local resv
00:03:19.895 09:26:42 -- setup/hugepages.sh@94 -- # local anon
00:03:19.895 09:26:42 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:19.895 09:26:42 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:03:19.895 09:26:42 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:19.895 09:26:42 -- setup/common.sh@18 -- # local node=
00:03:19.895 09:26:42 -- setup/common.sh@19 -- # local var val
00:03:19.895 09:26:42 -- setup/common.sh@20 -- # local mem_f mem
00:03:19.895 09:26:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:19.895 09:26:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:19.895 09:26:42 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:19.895 09:26:42 -- setup/common.sh@28 -- # mapfile -t mem
00:03:19.895 09:26:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:19.895 09:26:42 -- setup/common.sh@31 -- # IFS=': '
00:03:19.895 09:26:42 -- setup/common.sh@31 -- # read -r var val _
00:03:19.896 09:26:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43676824 kB' 'MemAvailable: 45311340 kB' 'Buffers: 6816 kB' 'Cached: 9299516 kB' 'SwapCached: 180 kB' 'Active: 6730672 kB' 'Inactive: 3166168 kB' 'Active(anon): 5823192 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593284 kB' 'Mapped: 168172 kB' 'Shmem: 7557628 kB' 'KReclaimable: 582612 kB' 'Slab: 1585632 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003020 kB' 'KernelStack: 21936 kB' 'PageTables: 8872 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10076256 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218132 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:19.896 09:26:42 -- setup/common.sh@31-32 -- # loop: keys MemTotal .. HardwareCorrupted all != AnonHugePages, continue on each
00:03:19.896 09:26:42 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:19.896 09:26:42 -- setup/common.sh@33 -- # echo 0
00:03:19.896 09:26:42 -- setup/common.sh@33 -- # return 0
00:03:19.896 09:26:42 -- setup/hugepages.sh@97 -- # anon=0
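The three lookups in this stretch (AnonHugePages above, then HugePages_Surp and HugePages_Rsvd below) feed the same accounting identity already visible at hugepages.sh@110 earlier, where 1536 == nr_hugepages + surp + resv held for the 512+1024 split. A hedged sketch of that bookkeeping, reusing the get_meminfo helper sketched earlier; the THP gate mirrors hugepages.sh@96, but how the pieces combine is an assumption, not the verbatim SPDK logic:

    # Sketch only: how verify_nr_hugepages' reads plausibly combine.
    anon=0
    # '[never]' absent from the THP setting means transparent hugepages can
    # appear, so their count is worth sampling (hugepages.sh@96-97).
    if [[ $(cat /sys/kernel/mm/transparent_hugepage/enabled) != *'[never]'* ]]; then
        anon=$(get_meminfo AnonHugePages)
    fi
    surp=$(get_meminfo HugePages_Surp)    # surplus pages beyond the configured pool
    resv=$(get_meminfo HugePages_Rsvd)    # reserved but not yet faulted pages
    total=$(get_meminfo HugePages_Total)
    # The pool checks out when the kernel's total matches the request plus
    # surplus and reserved pages -- the identity at hugepages.sh@110 above.
    (( total == nr_hugepages + surp + resv )) && echo "hugepages verified"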
00:03:19.896 09:26:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:19.896 09:26:42 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:19.896 09:26:42 -- setup/common.sh@28 -- # mapfile -t mem 00:03:19.896 09:26:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:19.897 09:26:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43677536 kB' 'MemAvailable: 45312052 kB' 'Buffers: 6816 kB' 'Cached: 9299516 kB' 'SwapCached: 180 kB' 'Active: 6729948 kB' 'Inactive: 3166168 kB' 'Active(anon): 5822468 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592528 kB' 'Mapped: 168120 kB' 'Shmem: 7557628 kB' 'KReclaimable: 582612 kB' 'Slab: 1585716 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003104 kB' 'KernelStack: 21888 kB' 'PageTables: 8624 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10076268 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:19.897 09:26:42 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.897 09:26:42 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.897 09:26:42 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.897 09:26:42 -- setup/common.sh@32 -- # continue 00:03:19.897 09:26:42 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.897 09:26:42 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.897 09:26:42 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.897 09:26:42 -- setup/common.sh@32 -- # continue 00:03:19.897 09:26:42 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.897 09:26:42 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.897 09:26:42 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.897 09:26:42 -- setup/common.sh@32 -- # continue 00:03:19.897 09:26:42 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.897 09:26:42 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.897 09:26:42 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.897 09:26:42 -- setup/common.sh@32 -- # continue 00:03:19.897 09:26:42 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.897 09:26:42 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.897 09:26:42 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.897 09:26:42 -- setup/common.sh@32 -- # continue 00:03:19.897 09:26:42 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.897 09:26:42 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.897 09:26:42 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.897 09:26:42 -- setup/common.sh@32 -- # continue 00:03:19.897 09:26:42 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.897 09:26:42 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.897 09:26:42 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.897 
09:26:42 -- setup/common.sh@32 -- # continue 00:03:19.897
[xtrace elided: setup/common.sh@31-32 repeat IFS=': ', read -r var val _, continue for every remaining meminfo field that is not HugePages_Surp] 00:03:19.897
09:26:42 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.898
09:26:42 -- setup/common.sh@33 -- # echo 0 00:03:19.898
09:26:42 -- setup/common.sh@33 -- # return 0 00:03:19.898
09:26:42 -- setup/hugepages.sh@99 -- # surp=0 00:03:19.898
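The lookup just traced is a plain scan of /proc/meminfo: read the file into an array, strip any "Node N " prefix, then split each line on ': ' until the requested field matches. A minimal standalone sketch of that pattern, reconstructed from the xtrace above (the function name mirrors the helper in setup/common.sh, but this is a reconstruction from the trace, not the SPDK source):

    #!/usr/bin/env bash
    # get_meminfo FIELD [NODE]: print FIELD's value from /proc/meminfo, or from
    # /sys/devices/system/node/nodeN/meminfo when a node id is given.
    shopt -s extglob   # required for the +([0-9]) pattern below

    get_meminfo() {
        local get=$1 node=${2:-} var val _ line
        local mem_f=/proc/meminfo mem
        # With an empty node this path does not exist, so /proc/meminfo is
        # kept, exactly as the [[ -e ... ]] test in the trace shows.
        [[ -e /sys/devices/system/node/node$node/meminfo ]] \
            && mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        # Per-node meminfo prefixes every line with "Node N "; strip it.
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }

    get_meminfo HugePages_Surp      # prints 0 in the run above
    get_meminfo HugePages_Surp 0    # same field, read from node0's meminfo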
09:26:42 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:19.898
09:26:42 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:19.898
09:26:42 -- setup/common.sh@18 -- # local node= 00:03:19.898
09:26:42 -- setup/common.sh@19 -- # local var val 00:03:19.898
09:26:42 -- setup/common.sh@20 -- # local mem_f mem 00:03:19.898
09:26:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:19.898
09:26:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:19.898
09:26:42 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:19.898
09:26:42 -- setup/common.sh@28 -- # mapfile -t mem 00:03:19.898
09:26:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:19.898
09:26:42 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.898
09:26:42 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.898
09:26:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43677824 kB' 'MemAvailable: 45312340 kB' 'Buffers: 6816 kB' 'Cached: 9299528 kB' 'SwapCached: 180 kB' 'Active: 6730224 kB' 'Inactive: 3166168 kB' 'Active(anon): 5822744 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592764 kB' 'Mapped: 168160 kB' 'Shmem: 7557640 kB' 'KReclaimable: 582612 kB' 'Slab: 1585732 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003120 kB' 'KernelStack: 21872 kB' 'PageTables: 8464 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10076280 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218132 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:19.898
[xtrace elided: setup/common.sh@31-32 repeat IFS=': ', read -r var val _, continue for every remaining meminfo field that is not HugePages_Rsvd] 00:03:19.899
09:26:42 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:19.899
09:26:42 -- setup/common.sh@33 -- # echo 0 00:03:19.899
09:26:42 -- setup/common.sh@33 -- # return 0 00:03:19.899
09:26:42 -- setup/hugepages.sh@100 -- # resv=0 00:03:19.899
09:26:42 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:19.899
nr_hugepages=1024
09:26:42 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:19.899
resv_hugepages=0
09:26:42 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:19.899
surplus_hugepages=0
09:26:42 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:19.899
anon_hugepages=0
09:26:42 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:19.899
09:26:42 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:19.899
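The two arithmetic guards just traced pin down the pool: the expected page count must equal the configured total plus surplus plus reserved pages, and the configured total itself must match exactly. A hedged sketch of the same checks, reusing the get_meminfo sketch above (1024 is this run's expected value):

    # Consistency checks as traced at setup/hugepages.sh@107 and @109.
    expected=1024
    nr_hugepages=$(get_meminfo HugePages_Total)   # 1024 in this run
    surp=$(get_meminfo HugePages_Surp)            # 0 in this run
    resv=$(get_meminfo HugePages_Rsvd)            # 0 in this run
    # Surplus or reserved pages would mean the pool is not in the steady
    # state the test configured; both sums must land on the expected count.
    (( expected == nr_hugepages + surp + resv )) || exit 1
    (( expected == nr_hugepages )) || exit 1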
09:26:42 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:19.899
09:26:42 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:19.899
09:26:42 -- setup/common.sh@18 -- # local node= 00:03:19.899
09:26:42 -- setup/common.sh@19 -- # local var val 00:03:19.899
09:26:42 -- setup/common.sh@20 -- # local mem_f mem 00:03:19.899
09:26:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:19.899
09:26:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:19.899
09:26:42 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:19.899
09:26:42 -- setup/common.sh@28 -- # mapfile -t mem 00:03:19.899
09:26:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:19.899
09:26:42 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.899
09:26:42 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.900
09:26:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43675820 kB' 'MemAvailable: 45310336 kB' 'Buffers: 6816 kB' 'Cached: 9299532 kB' 'SwapCached: 180 kB' 'Active: 6730056 kB' 'Inactive: 3166168 kB' 'Active(anon): 5822576 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 592604 kB' 'Mapped: 168160 kB' 'Shmem: 7557644 kB' 'KReclaimable: 582612 kB' 'Slab: 1585732 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1003120 kB' 'KernelStack: 21856 kB' 'PageTables: 8600 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10076296 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218132 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:19.900
[xtrace elided: setup/common.sh@31-32 repeat IFS=': ', read -r var val _, continue for every remaining meminfo field that is not HugePages_Total] 00:03:19.901
09:26:42 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:19.901
09:26:42 -- setup/common.sh@33 -- # echo 1024 00:03:19.901
09:26:42 -- setup/common.sh@33 -- # return 0 00:03:19.901
09:26:42 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:19.901
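get_nodes, traced next, discovers the NUMA topology by globbing /sys/devices/system/node and records the per-node hugepage counts (1024 on node0, 0 on node1 in this run). A sketch of that enumeration follows; the sysfs nr_hugepages path is an assumption, since the trace shows the resulting assignments but not where the values were read from:

    # Sketch of the node walk in get_nodes below. nodes_sys maps node id to
    # its current 2 MB hugepage count (Hugepagesize is 2048 kB in this run).
    shopt -s extglob
    declare -A nodes_sys
    for node in /sys/devices/system/node/node+([0-9]); do
        # ${node##*node} reduces ".../node0" to the bare id "0".
        nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
    done
    no_nodes=${#nodes_sys[@]}       # 2 on this machine
    (( no_nodes > 0 )) || exit 1    # at least one node must be visible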
09:26:42 -- setup/hugepages.sh@112 -- # get_nodes 00:03:19.901
09:26:42 -- setup/hugepages.sh@27 -- # local node 00:03:19.901
09:26:42 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:19.901
09:26:42 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:19.901
09:26:42 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:19.901
09:26:42 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:19.901
09:26:42 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:19.901
09:26:42 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:19.901
09:26:42 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:19.901
09:26:42 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:19.901
09:26:42 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:19.901
09:26:42 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:19.901
09:26:42 -- setup/common.sh@18 -- # local node=0 00:03:19.901
09:26:42 -- setup/common.sh@19 -- # local var val 00:03:19.901
09:26:42 -- setup/common.sh@20 -- # local mem_f mem 00:03:19.901
09:26:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:19.901
09:26:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:19.901
09:26:42 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:19.901
09:26:42 -- setup/common.sh@28 -- # mapfile -t mem 00:03:19.901
09:26:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:19.901
09:26:42 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.901
09:26:42 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.901
09:26:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23711040 kB' 'MemUsed: 8923396 kB' 'SwapCached: 80 kB' 'Active: 3965116 kB' 'Inactive: 535356 kB' 'Active(anon): 3187552 kB' 'Inactive(anon): 152 kB' 'Active(file): 777564 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4220996 kB' 'Mapped: 104180 kB' 'AnonPages: 282300 kB' 'Shmem: 2908148 kB' 'KernelStack: 9880 kB' 'PageTables: 4492 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 393860 kB' 'Slab: 876272 kB' 'SReclaimable: 393860 kB' 'SUnreclaim: 482412 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:19.901
[xtrace elided: setup/common.sh@31-32 repeat IFS=': ', read -r var val _, continue for every remaining node0 meminfo field that is not HugePages_Surp] 00:03:19.902
09:26:42 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:19.902
09:26:42 -- setup/common.sh@33 -- # echo 0 00:03:19.902
09:26:42 -- setup/common.sh@33 -- # return 0 00:03:19.902
09:26:42 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:19.902
09:26:42 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:19.902
09:26:42 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:19.902
09:26:42 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:19.902
09:26:42 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:19.902
node0=1024 expecting 1024
09:26:42 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:19.902
09:26:42 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:03:19.902
09:26:42 -- setup/hugepages.sh@202 -- # NRHUGE=512 00:03:19.902
09:26:42 -- setup/hugepages.sh@202 -- # setup output 00:03:19.902
09:26:42 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:19.902
09:26:42 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:24.105
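The test then re-enters the allocator with a smaller request and without clearing, which is what produces the INFO line below: 512 pages are requested per node while 1024 are already pinned on node0, so nothing changes. A sketch of an equivalent invocation (the sudo -E and the repo-relative path are assumptions; the trace calls the absolute Jenkins workspace path):

    # Rerun the allocator without clearing the existing pool, as traced above.
    # CLEAR_HUGE=no keeps the pages already allocated; NRHUGE=512 asks for 512
    # per node, already satisfied by the 1024 present, hence "already allocated".
    CLEAR_HUGE=no NRHUGE=512 sudo -E ./scripts/setup.sh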
0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:24.105
0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:24.105
0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:24.105
0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:24.105
0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:24.105
0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:24.105
0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:24.105
0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:24.105
0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:24.105
0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:24.105
0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:24.105
0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:24.105
0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:24.105
0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:24.105
0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:24.105
0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:24.105
0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:24.105
INFO: Requested 512 hugepages but 1024 already allocated on node0 00:03:24.105
09:26:46 -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:03:24.105
09:26:46 -- setup/hugepages.sh@89 -- # local node 00:03:24.105
09:26:46 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:24.105
09:26:46 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:24.105
09:26:46 -- setup/hugepages.sh@92 -- # local surp 00:03:24.105
09:26:46 -- setup/hugepages.sh@93 -- # local resv 00:03:24.105
09:26:46 -- setup/hugepages.sh@94 -- # local anon 00:03:24.105
09:26:46 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:24.105
09:26:46 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:24.105
09:26:46 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:24.105
09:26:46 -- setup/common.sh@18 -- # local node= 00:03:24.105
09:26:46 -- setup/common.sh@19 -- # local var val 00:03:24.105
09:26:46 -- setup/common.sh@20 -- # local mem_f mem 00:03:24.105
09:26:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:24.105
09:26:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:24.105
09:26:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:24.105
09:26:46 -- setup/common.sh@28 -- # mapfile -t mem 00:03:24.105
09:26:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:24.105
09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.105
09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.105
09:26:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43697824 kB' 'MemAvailable: 45332340 kB' 'Buffers: 6816 kB' 'Cached: 9299640 kB' 'SwapCached: 180 kB' 'Active: 6731580 kB' 'Inactive: 3166168 kB' 'Active(anon): 5824100 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593928 kB' 'Mapped: 168256 kB' 'Shmem: 7557752 kB' 'KReclaimable: 582612 kB' 'Slab: 1585608 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1002996 kB' 'KernelStack: 21904 kB' 'PageTables: 8600 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10072356 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218100 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB' 00:03:24.105
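verify_nr_hugepages first tests the transparent-hugepage mode string ("always [madvise] never" on this host) for "[never]" before it bothers reading AnonHugePages. A sketch of that gate; the sysfs path is the standard THP knob, inferred from the bracketed string in the trace rather than shown by it:

    # THP gate as traced at setup/hugepages.sh@96: anonymous hugepage use is
    # only meaningful when THP is not globally disabled.
    thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)  # "always [madvise] never" here
    if [[ $thp != *"[never]"* ]]; then
        anon=$(get_meminfo AnonHugePages)   # kB; 0 in the snapshot above
    else
        anon=0
    fi
    echo "anon_hugepages=$anon"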
29360128 kB' 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.105 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:24.105 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.105 09:26:46 -- 
00:03:24.106 09:26:46 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:24.106 09:26:46 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:24.106 09:26:46 -- setup/common.sh@18 -- # local node=
00:03:24.106 09:26:46 -- setup/common.sh@19 -- # local var val
00:03:24.106 09:26:46 -- setup/common.sh@20 -- # local mem_f mem
00:03:24.106 09:26:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:24.106 09:26:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:24.106 09:26:46 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:24.106 09:26:46 -- setup/common.sh@28 -- # mapfile -t mem
00:03:24.106 09:26:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:24.106 09:26:46 -- setup/common.sh@31 -- # IFS=': '
00:03:24.106 09:26:46 -- setup/common.sh@31 -- # read -r var val _
00:03:24.106 09:26:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43698280 kB' 'MemAvailable: 45332796 kB' 'Buffers: 6816 kB' 'Cached: 9299644 kB' 'SwapCached: 180 kB' 'Active: 6730800 kB' 'Inactive: 3166168 kB' 'Active(anon): 5823320 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593664 kB' 'Mapped: 168172 kB' 'Shmem: 7557756 kB' 'KReclaimable: 582612 kB' 'Slab: 1585596 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1002984 kB' 'KernelStack: 21904 kB' 'PageTables: 8588 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10072368 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:24.106 09:26:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:24.106 09:26:46 -- setup/common.sh@32 -- # continue
[... the same cycle repeats for each following field; HugePages_Surp sits near the end of /proc/meminfo, so every earlier field is skipped ...]
00:03:24.107 09:26:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:24.107 09:26:46 -- setup/common.sh@33 -- # echo 0
00:03:24.107 09:26:46 -- setup/common.sh@33 -- # return 0
00:03:24.107 09:26:46 -- setup/hugepages.sh@99 -- # surp=0
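The heavily backslash-escaped comparisons in these trace lines are not corruption: when the right-hand side of == inside [[ ]] is quoted, bash's xtrace prints it with every character escaped to show it is matched literally rather than as a glob pattern. A standalone demo of that rendering (the values here are hypothetical):

set -x
get=HugePages_Surp
var=MemTotal
[[ $var == "$get" ]] || echo "no match"
# xtrace prints: [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
set +x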
00:03:24.107 09:26:46 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:24.107 09:26:46 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:24.108 09:26:46 -- setup/common.sh@18 -- # local node=
00:03:24.108 09:26:46 -- setup/common.sh@19 -- # local var val
00:03:24.108 09:26:46 -- setup/common.sh@20 -- # local mem_f mem
00:03:24.108 09:26:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:24.108 09:26:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:24.108 09:26:46 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:24.108 09:26:46 -- setup/common.sh@28 -- # mapfile -t mem
00:03:24.108 09:26:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:24.108 09:26:46 -- setup/common.sh@31 -- # IFS=': '
00:03:24.108 09:26:46 -- setup/common.sh@31 -- # read -r var val _
00:03:24.108 09:26:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43699056 kB' 'MemAvailable: 45333572 kB' 'Buffers: 6816 kB' 'Cached: 9299656 kB' 'SwapCached: 180 kB' 'Active: 6730824 kB' 'Inactive: 3166168 kB' 'Active(anon): 5823344 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593664 kB' 'Mapped: 168172 kB' 'Shmem: 7557768 kB' 'KReclaimable: 582612 kB' 'Slab: 1585596 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1002984 kB' 'KernelStack: 21904 kB' 'PageTables: 8588 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10072380 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:24.108 09:26:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:24.108 09:26:46 -- setup/common.sh@32 -- # continue
[... the same cycle repeats for each following field until HugePages_Rsvd is reached ...]
00:03:24.109 09:26:46 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:24.109 09:26:46 -- setup/common.sh@33 -- # echo 0
00:03:24.109 09:26:46 -- setup/common.sh@33 -- # return 0
00:03:24.109 09:26:46 -- setup/hugepages.sh@100 -- # resv=0
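The three lookups traced so far reduce to three calls against the sketch above (hugepages.sh@97/@99/@100; the variable names are the ones the trace uses):

anon=$(get_meminfo AnonHugePages)    # -> 0
surp=$(get_meminfo HugePages_Surp)   # -> 0
resv=$(get_meminfo HugePages_Rsvd)   # -> 0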
00:03:24.109 09:26:46 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:24.109 nr_hugepages=1024
00:03:24.109 09:26:46 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:24.109 resv_hugepages=0
00:03:24.109 09:26:46 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:24.109 surplus_hugepages=0
00:03:24.109 09:26:46 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:24.109 anon_hugepages=0
00:03:24.109 09:26:46 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:24.109 09:26:46 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:24.109 09:26:46 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:24.109 09:26:46 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:24.109 09:26:46 -- setup/common.sh@18 -- # local node=
00:03:24.109 09:26:46 -- setup/common.sh@19 -- # local var val
00:03:24.109 09:26:46 -- setup/common.sh@20 -- # local mem_f mem
00:03:24.109 09:26:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:24.109 09:26:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:24.109 09:26:46 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:24.109 09:26:46 -- setup/common.sh@28 -- # mapfile -t mem
00:03:24.109 09:26:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:24.109 09:26:46 -- setup/common.sh@31 -- # IFS=': '
00:03:24.109 09:26:46 -- setup/common.sh@31 -- # read -r var val _
00:03:24.109 09:26:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283796 kB' 'MemFree: 43699312 kB' 'MemAvailable: 45333828 kB' 'Buffers: 6816 kB' 'Cached: 9299672 kB' 'SwapCached: 180 kB' 'Active: 6730832 kB' 'Inactive: 3166168 kB' 'Active(anon): 5823352 kB' 'Inactive(anon): 2324944 kB' 'Active(file): 907480 kB' 'Inactive(file): 841224 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 7698940 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 593664 kB' 'Mapped: 168172 kB' 'Shmem: 7557784 kB' 'KReclaimable: 582612 kB' 'Slab: 1585596 kB' 'SReclaimable: 582612 kB' 'SUnreclaim: 1002984 kB' 'KernelStack: 21904 kB' 'PageTables: 8588 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481924 kB' 'Committed_AS: 10072396 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 218068 kB' 'VmallocChunk: 0 kB' 'Percpu: 117376 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 3210612 kB' 'DirectMap2M: 37369856 kB' 'DirectMap1G: 29360128 kB'
00:03:24.109 09:26:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:24.109 09:26:46 -- setup/common.sh@32 -- # continue
[... the same cycle repeats for each following field until HugePages_Total is reached ...]
00:03:24.110 09:26:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:24.110 09:26:46 -- setup/common.sh@33 -- # echo 1024
00:03:24.110 09:26:46 -- setup/common.sh@33 -- # return 0
00:03:24.110 09:26:46 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
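Restated as a sketch, the consistency checks at hugepages.sh@107-@110 amount to the following, where 1024 is this run's requested hugepage count, the values are the ones the trace produced, and get_meminfo is the sketch above:

nr_hugepages=1024 surp=0 resv=0
(( 1024 == nr_hugepages + surp + resv ))                            # @107: books balance
(( 1024 == nr_hugepages ))                                          # @109: no surplus/reserved drift
(( $(get_meminfo HugePages_Total) == nr_hugepages + surp + resv ))  # @110: kernel total agrees

All three hold here, which is why the trace proceeds straight to the per-node checks.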
00:03:24.110 09:26:46 -- setup/hugepages.sh@112 -- # get_nodes
00:03:24.111 09:26:46 -- setup/hugepages.sh@27 -- # local node
00:03:24.111 09:26:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:24.111 09:26:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:24.111 09:26:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:24.111 09:26:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:24.111 09:26:46 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:24.111 09:26:46 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:24.111 09:26:46 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:24.111 09:26:46 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:24.111 09:26:46 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:24.111 09:26:46 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:24.111 09:26:46 -- setup/common.sh@18 -- # local node=0
00:03:24.111 09:26:46 -- setup/common.sh@19 -- # local var val
00:03:24.111 09:26:46 -- setup/common.sh@20 -- # local mem_f mem
00:03:24.111 09:26:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:24.111 09:26:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:24.111 09:26:46 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:24.111 09:26:46 -- setup/common.sh@28 -- # mapfile -t mem
00:03:24.111 09:26:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:24.111 09:26:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634436 kB' 'MemFree: 23709800 kB' 'MemUsed: 8924636 kB' 'SwapCached: 80 kB' 'Active: 3966312 kB' 'Inactive: 535356 kB' 'Active(anon): 3188748 kB' 'Inactive(anon): 152 kB' 'Active(file): 777564 kB' 'Inactive(file): 535204 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 4221104 kB' 'Mapped: 104192 kB' 'AnonPages: 283876 kB' 'Shmem: 2908256 kB' 'KernelStack: 9960 kB' 'PageTables: 4792 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 393860 kB' 'Slab: 876220 kB' 'SReclaimable: 393860 kB' 'SUnreclaim: 482360 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
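get_nodes (hugepages.sh@112) globs the sysfs node directories and records each node's hugepage count before the per-node surplus check begins. A sketch reusing the helper above; the trace only shows the resulting assignments, so reading the value via get_meminfo HugePages_Total is an assumption:

shopt -s extglob
nodes_sys=()
for node in /sys/devices/system/node/node+([0-9]); do
    # ${node##*node} reduces ".../node0" to the bare index "0"
    nodes_sys[${node##*node}]=$(get_meminfo HugePages_Total "${node##*node}")
done
# On this machine the trace records nodes_sys[0]=1024, nodes_sys[1]=0, no_nodes=2

Note that the helper switches mem_f to /sys/devices/system/node/node0/meminfo here (common.sh@24), which is why the snapshot above is node0's view rather than the whole system's.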
setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ Dirty == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.111 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.111 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.112 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.112 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.112 09:26:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.112 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.112 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.112 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.112 09:26:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.112 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.112 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.112 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.112 09:26:46 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.112 09:26:46 -- setup/common.sh@32 -- # continue 00:03:24.112 09:26:46 -- setup/common.sh@31 -- # IFS=': ' 00:03:24.112 09:26:46 -- setup/common.sh@31 -- # read -r var val _ 00:03:24.112 09:26:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:24.112 09:26:46 -- setup/common.sh@33 -- # echo 0 00:03:24.112 09:26:46 -- setup/common.sh@33 -- # return 0 00:03:24.112 09:26:46 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:24.112 09:26:46 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:24.112 
09:26:46 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:24.112 09:26:46 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:24.112 09:26:46 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:24.112 node0=1024 expecting 1024 00:03:24.112 09:26:46 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:24.112 00:03:24.112 real 0m7.171s 00:03:24.112 user 0m2.701s 00:03:24.112 sys 0m4.593s 00:03:24.112 09:26:46 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:24.112 09:26:46 -- common/autotest_common.sh@10 -- # set +x 00:03:24.112 ************************************ 00:03:24.112 END TEST no_shrink_alloc 00:03:24.112 ************************************ 00:03:24.112 09:26:46 -- setup/hugepages.sh@217 -- # clear_hp 00:03:24.112 09:26:46 -- setup/hugepages.sh@37 -- # local node hp 00:03:24.112 09:26:46 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:24.112 09:26:46 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:24.112 09:26:46 -- setup/hugepages.sh@41 -- # echo 0 00:03:24.112 09:26:46 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:24.112 09:26:46 -- setup/hugepages.sh@41 -- # echo 0 00:03:24.112 09:26:46 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:24.112 09:26:46 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:24.112 09:26:46 -- setup/hugepages.sh@41 -- # echo 0 00:03:24.112 09:26:46 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:24.112 09:26:46 -- setup/hugepages.sh@41 -- # echo 0 00:03:24.112 09:26:46 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:24.112 09:26:46 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:24.112 00:03:24.112 real 0m25.699s 00:03:24.112 user 0m8.713s 00:03:24.112 sys 0m15.593s 00:03:24.112 09:26:46 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:24.112 09:26:46 -- common/autotest_common.sh@10 -- # set +x 00:03:24.112 ************************************ 00:03:24.112 END TEST hugepages 00:03:24.112 ************************************ 00:03:24.112 09:26:46 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:24.112 09:26:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:24.112 09:26:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:24.112 09:26:46 -- common/autotest_common.sh@10 -- # set +x 00:03:24.112 ************************************ 00:03:24.112 START TEST driver 00:03:24.112 ************************************ 00:03:24.112 09:26:46 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:03:24.112 * Looking for test storage... 
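The long runs of '[[ <field> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]' followed by 'continue' in the trace above are setup/common.sh's get_meminfo scanning /proc/meminfo (or the per-node copy under sysfs) one field at a time until the requested key matches. A minimal sketch of that lookup, reconstructed from the trace and simplified rather than quoted verbatim:

    get_meminfo() {
        local get=$1 node=$2 var val _
        local mem_f=/proc/meminfo mem
        # Per-node counters live in sysfs; fall back to the global file otherwise.
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node rows carry a "Node <n> " prefix; strip it so both formats parse alike.
        mem=("${mem[@]#Node $node }")
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue   # one traced compare per meminfo field
            echo "$val"
            return 0
        done < <(printf '%s\n' "${mem[@]}")
        return 1
    }
    # e.g. get_meminfo HugePages_Surp 0 returns 0 in this run, feeding the
    # 'node0=1024 expecting 1024' check that closes the test above.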
00:03:24.112 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:24.112 09:26:46 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:24.112 09:26:46 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:24.112 09:26:46 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:24.112 09:26:46 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:24.112 09:26:46 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:24.112 09:26:46 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:24.112 09:26:46 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:24.112 09:26:46 -- scripts/common.sh@335 -- # IFS=.-: 00:03:24.112 09:26:46 -- scripts/common.sh@335 -- # read -ra ver1 00:03:24.112 09:26:46 -- scripts/common.sh@336 -- # IFS=.-: 00:03:24.112 09:26:46 -- scripts/common.sh@336 -- # read -ra ver2 00:03:24.112 09:26:46 -- scripts/common.sh@337 -- # local 'op=<' 00:03:24.112 09:26:46 -- scripts/common.sh@339 -- # ver1_l=2 00:03:24.112 09:26:46 -- scripts/common.sh@340 -- # ver2_l=1 00:03:24.112 09:26:46 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:24.112 09:26:46 -- scripts/common.sh@343 -- # case "$op" in 00:03:24.112 09:26:46 -- scripts/common.sh@344 -- # : 1 00:03:24.112 09:26:46 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:24.112 09:26:46 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:24.112 09:26:46 -- scripts/common.sh@364 -- # decimal 1 00:03:24.112 09:26:46 -- scripts/common.sh@352 -- # local d=1 00:03:24.112 09:26:46 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:24.112 09:26:46 -- scripts/common.sh@354 -- # echo 1 00:03:24.112 09:26:46 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:24.112 09:26:46 -- scripts/common.sh@365 -- # decimal 2 00:03:24.112 09:26:46 -- scripts/common.sh@352 -- # local d=2 00:03:24.112 09:26:46 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:24.112 09:26:46 -- scripts/common.sh@354 -- # echo 2 00:03:24.112 09:26:46 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:24.112 09:26:46 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:24.112 09:26:46 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:24.112 09:26:46 -- scripts/common.sh@367 -- # return 0 00:03:24.112 09:26:46 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:24.112 09:26:46 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:24.112 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:24.112 --rc genhtml_branch_coverage=1 00:03:24.112 --rc genhtml_function_coverage=1 00:03:24.112 --rc genhtml_legend=1 00:03:24.112 --rc geninfo_all_blocks=1 00:03:24.112 --rc geninfo_unexecuted_blocks=1 00:03:24.112 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:24.112 ' 00:03:24.112 09:26:46 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:24.112 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:24.112 --rc genhtml_branch_coverage=1 00:03:24.112 --rc genhtml_function_coverage=1 00:03:24.112 --rc genhtml_legend=1 00:03:24.112 --rc geninfo_all_blocks=1 00:03:24.112 --rc geninfo_unexecuted_blocks=1 00:03:24.112 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:24.112 ' 00:03:24.112 09:26:46 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:24.112 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:24.112 --rc genhtml_branch_coverage=1 
00:03:24.112 --rc genhtml_function_coverage=1 00:03:24.112 --rc genhtml_legend=1 00:03:24.112 --rc geninfo_all_blocks=1 00:03:24.112 --rc geninfo_unexecuted_blocks=1 00:03:24.112 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:24.112 ' 00:03:24.112 09:26:46 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:24.112 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:24.112 --rc genhtml_branch_coverage=1 00:03:24.112 --rc genhtml_function_coverage=1 00:03:24.112 --rc genhtml_legend=1 00:03:24.112 --rc geninfo_all_blocks=1 00:03:24.112 --rc geninfo_unexecuted_blocks=1 00:03:24.112 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:24.112 ' 00:03:24.112 09:26:46 -- setup/driver.sh@68 -- # setup reset 00:03:24.112 09:26:46 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:24.112 09:26:46 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:29.392 09:26:51 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:03:29.392 09:26:51 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:29.392 09:26:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:29.392 09:26:51 -- common/autotest_common.sh@10 -- # set +x 00:03:29.392 ************************************ 00:03:29.392 START TEST guess_driver 00:03:29.392 ************************************ 00:03:29.392 09:26:51 -- common/autotest_common.sh@1114 -- # guess_driver 00:03:29.392 09:26:51 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:03:29.392 09:26:51 -- setup/driver.sh@47 -- # local fail=0 00:03:29.392 09:26:51 -- setup/driver.sh@49 -- # pick_driver 00:03:29.392 09:26:51 -- setup/driver.sh@36 -- # vfio 00:03:29.392 09:26:51 -- setup/driver.sh@21 -- # local iommu_groups 00:03:29.392 09:26:51 -- setup/driver.sh@22 -- # local unsafe_vfio 00:03:29.392 09:26:51 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:03:29.392 09:26:51 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:03:29.392 09:26:51 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:03:29.392 09:26:51 -- setup/driver.sh@29 -- # (( 176 > 0 )) 00:03:29.392 09:26:51 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:03:29.392 09:26:51 -- setup/driver.sh@14 -- # mod vfio_pci 00:03:29.392 09:26:51 -- setup/driver.sh@12 -- # dep vfio_pci 00:03:29.392 09:26:51 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:03:29.392 09:26:51 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:03:29.392 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:29.392 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:29.392 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:03:29.392 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:03:29.392 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:03:29.392 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:03:29.392 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:03:29.392 09:26:51 -- setup/driver.sh@30 -- # return 0 00:03:29.392 09:26:51 -- setup/driver.sh@37 -- # echo vfio-pci 00:03:29.392 09:26:51 -- setup/driver.sh@49 -- # driver=vfio-pci 00:03:29.392 09:26:51 -- setup/driver.sh@51 
-- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:03:29.392 09:26:51 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:03:29.392 Looking for driver=vfio-pci 00:03:29.392 09:26:51 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:29.392 09:26:51 -- setup/driver.sh@45 -- # setup output config 00:03:29.392 09:26:51 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:29.392 09:26:51 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:32.687 09:26:54 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:32.687 09:26:54 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:32.687 09:26:54 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:32.687 09:26:54 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:32.687 09:26:54 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:32.687 09:26:54 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:32.687 09:26:54 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:32.687 09:26:54 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:32.687 09:26:54 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:32.687 09:26:54 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:32.687 09:26:54 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:32.687 09:26:54 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:32.687 09:26:54 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:32.687 09:26:54 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:32.687 09:26:54 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:32.687 09:26:54 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:32.687 09:26:54 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:32.687 09:26:54 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:32.687 09:26:54 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:32.687 09:26:54 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:32.687 09:26:54 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:32.687 09:26:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:32.687 09:26:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:32.687 09:26:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:32.687 09:26:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:32.687 09:26:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:32.687 09:26:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:32.687 09:26:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:32.687 09:26:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:32.687 09:26:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:32.687 09:26:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:32.687 09:26:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:32.687 09:26:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:32.687 09:26:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:32.687 09:26:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:32.687 09:26:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:32.687 09:26:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:32.687 09:26:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:32.687 09:26:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:32.687 09:26:55 -- setup/driver.sh@58 -- # [[ -> == \-\> 
]] 00:03:32.687 09:26:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:32.687 09:26:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:32.687 09:26:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:32.687 09:26:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:32.687 09:26:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:32.687 09:26:55 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:32.687 09:26:55 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:32.687 09:26:55 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:34.114 09:26:56 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:03:34.114 09:26:56 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:03:34.114 09:26:56 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:03:34.114 09:26:56 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:03:34.114 09:26:56 -- setup/driver.sh@65 -- # setup reset 00:03:34.114 09:26:56 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:34.114 09:26:56 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:39.393 00:03:39.393 real 0m9.866s 00:03:39.393 user 0m2.546s 00:03:39.393 sys 0m5.067s 00:03:39.393 09:27:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:39.393 09:27:01 -- common/autotest_common.sh@10 -- # set +x 00:03:39.393 ************************************ 00:03:39.393 END TEST guess_driver 00:03:39.393 ************************************ 00:03:39.393 00:03:39.393 real 0m15.025s 00:03:39.393 user 0m3.993s 00:03:39.393 sys 0m8.032s 00:03:39.393 09:27:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:39.393 09:27:01 -- common/autotest_common.sh@10 -- # set +x 00:03:39.393 ************************************ 00:03:39.393 END TEST driver 00:03:39.393 ************************************ 00:03:39.393 09:27:01 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:03:39.393 09:27:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:39.393 09:27:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:39.393 09:27:01 -- common/autotest_common.sh@10 -- # set +x 00:03:39.393 ************************************ 00:03:39.393 START TEST devices 00:03:39.393 ************************************ 00:03:39.393 09:27:01 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:03:39.393 * Looking for test storage... 
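The guess_driver pass above reduces to: count the IOMMU groups, ask modprobe whether vfio_pci resolves to real kernel objects, and only then commit to vfio-pci. A condensed rendering of that decision (simplified from the setup/driver.sh trace; the uio_pci_generic fallback is assumed from SPDK's usual setup convention and is not exercised in this run):

    is_driver() {
        # Usable iff modprobe resolves the module to actual .ko files, which is
        # what the big 'insmod ... == *.ko*' test in the trace checks.
        [[ $(modprobe --show-depends "$1" 2>/dev/null) == *.ko* ]]
    }

    pick_driver() {
        local iommu_groups=(/sys/kernel/iommu_groups/*)
        local unsafe_vfio=N
        [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] &&
            unsafe_vfio=$(</sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
        # vfio-pci wants a populated IOMMU (176 groups in this run) or unsafe no-IOMMU mode.
        if { ((${#iommu_groups[@]} > 0)) || [[ $unsafe_vfio == Y ]]; } && is_driver vfio_pci; then
            echo vfio-pci
        elif is_driver uio_pci_generic; then
            echo uio_pci_generic
        else
            echo 'No valid driver found'
        fi
    }

The 'No valid driver found' sentinel is exactly the string the driver.sh@51 pattern match above rejects before the script echoes 'Looking for driver=vfio-pci' and walks the setup output.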
00:03:39.393 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:39.393 09:27:01 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:39.393 09:27:01 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:39.393 09:27:01 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:39.393 09:27:01 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:39.393 09:27:01 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:39.393 09:27:01 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:39.393 09:27:01 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:39.393 09:27:01 -- scripts/common.sh@335 -- # IFS=.-: 00:03:39.393 09:27:01 -- scripts/common.sh@335 -- # read -ra ver1 00:03:39.393 09:27:01 -- scripts/common.sh@336 -- # IFS=.-: 00:03:39.393 09:27:01 -- scripts/common.sh@336 -- # read -ra ver2 00:03:39.393 09:27:01 -- scripts/common.sh@337 -- # local 'op=<' 00:03:39.393 09:27:01 -- scripts/common.sh@339 -- # ver1_l=2 00:03:39.393 09:27:01 -- scripts/common.sh@340 -- # ver2_l=1 00:03:39.393 09:27:01 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:39.393 09:27:01 -- scripts/common.sh@343 -- # case "$op" in 00:03:39.393 09:27:01 -- scripts/common.sh@344 -- # : 1 00:03:39.393 09:27:01 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:39.393 09:27:01 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:39.393 09:27:01 -- scripts/common.sh@364 -- # decimal 1 00:03:39.393 09:27:01 -- scripts/common.sh@352 -- # local d=1 00:03:39.393 09:27:01 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:39.393 09:27:01 -- scripts/common.sh@354 -- # echo 1 00:03:39.393 09:27:01 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:39.393 09:27:01 -- scripts/common.sh@365 -- # decimal 2 00:03:39.393 09:27:01 -- scripts/common.sh@352 -- # local d=2 00:03:39.393 09:27:01 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:39.393 09:27:01 -- scripts/common.sh@354 -- # echo 2 00:03:39.393 09:27:01 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:39.393 09:27:01 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:39.393 09:27:01 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:39.394 09:27:01 -- scripts/common.sh@367 -- # return 0 00:03:39.394 09:27:01 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:39.394 09:27:01 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:39.394 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:39.394 --rc genhtml_branch_coverage=1 00:03:39.394 --rc genhtml_function_coverage=1 00:03:39.394 --rc genhtml_legend=1 00:03:39.394 --rc geninfo_all_blocks=1 00:03:39.394 --rc geninfo_unexecuted_blocks=1 00:03:39.394 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:39.394 ' 00:03:39.394 09:27:01 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:39.394 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:39.394 --rc genhtml_branch_coverage=1 00:03:39.394 --rc genhtml_function_coverage=1 00:03:39.394 --rc genhtml_legend=1 00:03:39.394 --rc geninfo_all_blocks=1 00:03:39.394 --rc geninfo_unexecuted_blocks=1 00:03:39.394 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:39.394 ' 00:03:39.394 09:27:01 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:39.394 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:39.394 --rc genhtml_branch_coverage=1 
00:03:39.394 --rc genhtml_function_coverage=1 00:03:39.394 --rc genhtml_legend=1 00:03:39.394 --rc geninfo_all_blocks=1 00:03:39.394 --rc geninfo_unexecuted_blocks=1 00:03:39.394 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:39.394 ' 00:03:39.394 09:27:01 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:39.394 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:39.394 --rc genhtml_branch_coverage=1 00:03:39.394 --rc genhtml_function_coverage=1 00:03:39.394 --rc genhtml_legend=1 00:03:39.394 --rc geninfo_all_blocks=1 00:03:39.394 --rc geninfo_unexecuted_blocks=1 00:03:39.394 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:39.394 ' 00:03:39.394 09:27:01 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:03:39.394 09:27:01 -- setup/devices.sh@192 -- # setup reset 00:03:39.394 09:27:01 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:39.394 09:27:01 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:42.684 09:27:05 -- setup/devices.sh@194 -- # get_zoned_devs 00:03:42.684 09:27:05 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:42.684 09:27:05 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:42.684 09:27:05 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:42.684 09:27:05 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:42.684 09:27:05 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:42.684 09:27:05 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:42.684 09:27:05 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:42.684 09:27:05 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:42.684 09:27:05 -- setup/devices.sh@196 -- # blocks=() 00:03:42.684 09:27:05 -- setup/devices.sh@196 -- # declare -a blocks 00:03:42.684 09:27:05 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:03:42.684 09:27:05 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:03:42.684 09:27:05 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:03:42.684 09:27:05 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:03:42.684 09:27:05 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:03:42.684 09:27:05 -- setup/devices.sh@201 -- # ctrl=nvme0 00:03:42.684 09:27:05 -- setup/devices.sh@202 -- # pci=0000:d8:00.0 00:03:42.684 09:27:05 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:42.684 09:27:05 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:03:42.684 09:27:05 -- scripts/common.sh@380 -- # local block=nvme0n1 pt 00:03:42.684 09:27:05 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:03:42.684 No valid GPT data, bailing 00:03:42.684 09:27:05 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:42.684 09:27:05 -- scripts/common.sh@393 -- # pt= 00:03:42.684 09:27:05 -- scripts/common.sh@394 -- # return 1 00:03:42.684 09:27:05 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:03:42.684 09:27:05 -- setup/common.sh@76 -- # local dev=nvme0n1 00:03:42.684 09:27:05 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:03:42.684 09:27:05 -- setup/common.sh@80 -- # echo 1600321314816 00:03:42.684 09:27:05 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size )) 00:03:42.684 09:27:05 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:03:42.684 09:27:05 -- 
setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0 00:03:42.684 09:27:05 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:03:42.684 09:27:05 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:03:42.684 09:27:05 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:03:42.684 09:27:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:42.684 09:27:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:42.684 09:27:05 -- common/autotest_common.sh@10 -- # set +x 00:03:42.684 ************************************ 00:03:42.684 START TEST nvme_mount 00:03:42.684 ************************************ 00:03:42.684 09:27:05 -- common/autotest_common.sh@1114 -- # nvme_mount 00:03:42.684 09:27:05 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:03:42.685 09:27:05 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:03:42.685 09:27:05 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:42.685 09:27:05 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:42.685 09:27:05 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:03:42.685 09:27:05 -- setup/common.sh@39 -- # local disk=nvme0n1 00:03:42.685 09:27:05 -- setup/common.sh@40 -- # local part_no=1 00:03:42.685 09:27:05 -- setup/common.sh@41 -- # local size=1073741824 00:03:42.685 09:27:05 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:03:42.685 09:27:05 -- setup/common.sh@44 -- # parts=() 00:03:42.685 09:27:05 -- setup/common.sh@44 -- # local parts 00:03:42.685 09:27:05 -- setup/common.sh@46 -- # (( part = 1 )) 00:03:42.685 09:27:05 -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:42.685 09:27:05 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:03:42.685 09:27:05 -- setup/common.sh@46 -- # (( part++ )) 00:03:42.685 09:27:05 -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:42.685 09:27:05 -- setup/common.sh@51 -- # (( size /= 512 )) 00:03:42.685 09:27:05 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:03:42.685 09:27:05 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:03:44.064 Creating new GPT entries in memory. 00:03:44.064 GPT data structures destroyed! You may now partition the disk using fdisk or 00:03:44.064 other utilities. 00:03:44.064 09:27:06 -- setup/common.sh@57 -- # (( part = 1 )) 00:03:44.064 09:27:06 -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:44.064 09:27:06 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:03:44.064 09:27:06 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:03:44.064 09:27:06 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:03:45.000 Creating new GPT entries in memory. 00:03:45.000 The operation has completed successfully. 
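The sgdisk exchange just above is partition_drive carving a single 1 GiB partition, after which the test formats and mounts it (the mkfs.ext4 and mount calls follow below). The same flow as a stand-alone sketch, with an illustrative mount point; the real run also keeps sync_dev_uevents.sh in the background so the partition node exists before mkfs:

    disk=/dev/nvme0n1
    mnt=/var/tmp/nvme_mount                        # illustrative mount point
    sectors=$((1073741824 / 512))                  # 1 GiB in 512-byte sectors = 2097152
    flock "$disk" sgdisk "$disk" --zap-all         # serialize sgdisk against udev probes
    flock "$disk" sgdisk "$disk" --new=1:2048:$((2048 + sectors - 1))   # ends at 2099199, as traced
    mkdir -p "$mnt"
    mkfs.ext4 -qF "${disk}p1"
    mount "${disk}p1" "$mnt"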
00:03:45.000 09:27:07 -- setup/common.sh@57 -- # (( part++ )) 00:03:45.000 09:27:07 -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:45.000 09:27:07 -- setup/common.sh@62 -- # wait 3138077 00:03:45.000 09:27:07 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:45.000 09:27:07 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:03:45.000 09:27:07 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:45.000 09:27:07 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:03:45.000 09:27:07 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:03:45.000 09:27:07 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:45.000 09:27:07 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:45.000 09:27:07 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:03:45.000 09:27:07 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:03:45.000 09:27:07 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:45.000 09:27:07 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:45.000 09:27:07 -- setup/devices.sh@53 -- # local found=0 00:03:45.000 09:27:07 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:45.000 09:27:07 -- setup/devices.sh@56 -- # : 00:03:45.000 09:27:07 -- setup/devices.sh@59 -- # local pci status 00:03:45.000 09:27:07 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:45.000 09:27:07 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:03:45.000 09:27:07 -- setup/devices.sh@47 -- # setup output config 00:03:45.000 09:27:07 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:45.000 09:27:07 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:03:48.293 09:27:10 -- setup/devices.sh@63 -- # found=1 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 
]] 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:48.293 09:27:10 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.293 09:27:10 -- setup/devices.sh@66 -- # (( found == 1 )) 00:03:48.293 09:27:10 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:03:48.293 09:27:10 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:48.293 09:27:10 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:48.294 09:27:10 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:48.294 09:27:10 -- setup/devices.sh@110 -- # cleanup_nvme 00:03:48.294 09:27:10 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:48.294 09:27:10 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:48.294 09:27:10 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:03:48.294 09:27:10 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:03:48.294 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:03:48.294 09:27:10 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:03:48.294 09:27:10 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:03:48.554 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:03:48.554 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:03:48.554 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe 
(PMBR): 55 aa 00:03:48.554 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:03:48.554 09:27:11 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:03:48.554 09:27:11 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:03:48.554 09:27:11 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:48.554 09:27:11 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:03:48.554 09:27:11 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:03:48.554 09:27:11 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:48.554 09:27:11 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:48.554 09:27:11 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:03:48.554 09:27:11 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:03:48.554 09:27:11 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:48.554 09:27:11 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:48.554 09:27:11 -- setup/devices.sh@53 -- # local found=0 00:03:48.554 09:27:11 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:48.554 09:27:11 -- setup/devices.sh@56 -- # : 00:03:48.554 09:27:11 -- setup/devices.sh@59 -- # local pci status 00:03:48.554 09:27:11 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:48.554 09:27:11 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:03:48.554 09:27:11 -- setup/devices.sh@47 -- # setup output config 00:03:48.554 09:27:11 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:48.554 09:27:11 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:03:51.843 09:27:14 -- setup/devices.sh@63 -- # found=1 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/devices.sh@66 -- # (( found == 1 )) 00:03:51.843 09:27:14 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:03:51.843 09:27:14 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:51.843 09:27:14 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:03:51.843 09:27:14 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:03:51.843 09:27:14 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:51.843 09:27:14 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' '' 00:03:51.843 09:27:14 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:03:51.843 09:27:14 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:03:51.843 09:27:14 -- setup/devices.sh@50 -- # local mount_point= 00:03:51.843 09:27:14 -- setup/devices.sh@51 -- # local test_file= 00:03:51.843 09:27:14 -- setup/devices.sh@53 -- # local found=0 00:03:51.843 09:27:14 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:03:51.843 09:27:14 -- setup/devices.sh@59 -- # local pci status 00:03:51.843 09:27:14 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:03:51.843 09:27:14 -- setup/devices.sh@47 -- # setup output config 00:03:51.843 09:27:14 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:51.843 09:27:14 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:51.843 09:27:14 -- setup/common.sh@10 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:03:55.155 09:27:17 -- setup/devices.sh@63 -- # found=1 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:55.155 09:27:17 -- setup/devices.sh@66 -- # (( found == 1 )) 00:03:55.155 09:27:17 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:03:55.155 09:27:17 -- setup/devices.sh@68 -- # return 0 00:03:55.155 09:27:17 -- setup/devices.sh@128 -- # cleanup_nvme 00:03:55.155 09:27:17 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:03:55.155 09:27:17 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:03:55.155 09:27:17 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:03:55.155 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:03:55.155 00:03:55.155 real 0m12.147s 00:03:55.155 user 0m3.265s 00:03:55.155 sys 0m6.601s 00:03:55.155 09:27:17 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:55.155 09:27:17 -- common/autotest_common.sh@10 -- # set +x 00:03:55.155 ************************************ 00:03:55.155 END TEST nvme_mount 00:03:55.155 ************************************ 00:03:55.155 09:27:17 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:03:55.155 09:27:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:55.155 09:27:17 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:55.155 09:27:17 -- common/autotest_common.sh@10 -- # set +x 00:03:55.155 ************************************ 00:03:55.155 START TEST dm_mount 00:03:55.155 ************************************ 00:03:55.155 09:27:17 -- common/autotest_common.sh@1114 -- # dm_mount 00:03:55.155 09:27:17 -- setup/devices.sh@144 -- # pv=nvme0n1 00:03:55.155 09:27:17 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:03:55.155 09:27:17 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:03:55.155 09:27:17 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:03:55.155 09:27:17 -- setup/common.sh@39 -- # local disk=nvme0n1 00:03:55.155 09:27:17 -- setup/common.sh@40 -- # local part_no=2 00:03:55.155 09:27:17 -- setup/common.sh@41 -- # local size=1073741824 00:03:55.155 09:27:17 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:03:55.155 09:27:17 -- setup/common.sh@44 -- # parts=() 00:03:55.155 09:27:17 -- setup/common.sh@44 -- # local parts 00:03:55.155 09:27:17 -- setup/common.sh@46 -- # (( part = 1 )) 00:03:55.155 09:27:17 -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:55.155 09:27:17 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:03:55.155 09:27:17 -- setup/common.sh@46 -- # (( part++ )) 00:03:55.155 09:27:17 -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:55.155 09:27:17 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:03:55.155 09:27:17 -- setup/common.sh@46 -- # (( part++ )) 00:03:55.155 09:27:17 -- setup/common.sh@46 -- # (( part <= part_no )) 00:03:55.155 09:27:17 -- setup/common.sh@51 -- # (( size /= 512 )) 00:03:55.155 09:27:17 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:03:55.155 09:27:17 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:03:56.095 Creating new GPT entries in memory. 00:03:56.095 GPT data structures destroyed! You may now partition the disk using fdisk or 00:03:56.095 other utilities. 00:03:56.095 09:27:18 -- setup/common.sh@57 -- # (( part = 1 )) 00:03:56.095 09:27:18 -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:56.095 09:27:18 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:03:56.095 09:27:18 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:03:56.095 09:27:18 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:03:57.034 Creating new GPT entries in memory. 00:03:57.034 The operation has completed successfully. 
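For dm_mount the same partitioning helper runs with part_no=2, and the uevent synchronization becomes visible: sync_dev_uevents.sh starts first, both partitions are created under flock, and the script waits on the synchronizer afterwards (the 'wait 3142629' a few lines below). Roughly, with SPDK_DIR standing in for the workspace checkout:

    disk=/dev/nvme0n1
    sectors=$((1073741824 / 512))
    # Block until the kernel has emitted add-uevents for both new partitions:
    "$SPDK_DIR"/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 &
    sync_pid=$!
    flock "$disk" sgdisk "$disk" --zap-all
    flock "$disk" sgdisk "$disk" --new=1:2048:$((2048 + sectors - 1))                     # 2048..2099199
    flock "$disk" sgdisk "$disk" --new=2:$((2048 + sectors)):$((2048 + 2 * sectors - 1))  # 2099200..4196351
    wait "$sync_pid"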
00:03:57.034 09:27:19 -- setup/common.sh@57 -- # (( part++ )) 00:03:57.034 09:27:19 -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:57.034 09:27:19 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:03:57.034 09:27:19 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:03:57.034 09:27:19 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:03:57.972 The operation has completed successfully. 00:03:57.972 09:27:20 -- setup/common.sh@57 -- # (( part++ )) 00:03:57.972 09:27:20 -- setup/common.sh@57 -- # (( part <= part_no )) 00:03:57.972 09:27:20 -- setup/common.sh@62 -- # wait 3142629 00:03:58.231 09:27:20 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:03:58.231 09:27:20 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:03:58.231 09:27:20 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:03:58.231 09:27:20 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:03:58.231 09:27:20 -- setup/devices.sh@160 -- # for t in {1..5} 00:03:58.231 09:27:20 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:03:58.231 09:27:20 -- setup/devices.sh@161 -- # break 00:03:58.231 09:27:20 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:03:58.231 09:27:20 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:03:58.231 09:27:20 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:03:58.231 09:27:20 -- setup/devices.sh@166 -- # dm=dm-0 00:03:58.231 09:27:20 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:03:58.231 09:27:20 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:03:58.231 09:27:20 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:03:58.231 09:27:20 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:03:58.231 09:27:20 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:03:58.231 09:27:20 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:03:58.231 09:27:20 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:03:58.231 09:27:20 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:03:58.231 09:27:20 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:03:58.231 09:27:20 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:03:58.231 09:27:20 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:03:58.231 09:27:20 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:03:58.231 09:27:20 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:03:58.231 09:27:20 -- setup/devices.sh@53 -- # local found=0 00:03:58.231 09:27:20 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:03:58.231 09:27:20 -- setup/devices.sh@56 -- # : 
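
Once both partitions exist, dm_mount stacks a device-mapper node named nvme_dm_test over them, waits for /dev/mapper/nvme_dm_test to appear, then formats and mounts it. The dmsetup table itself is not echoed into this log, so the linear concatenation below is an assumed layout, not a quote of the script; a minimal sketch:

  pv0=/dev/nvme0n1p1; pv1=/dev/nvme0n1p2
  len0=$(blockdev --getsz "$pv0")          # partition lengths in 512-byte sectors
  len1=$(blockdev --getsz "$pv1")
  # feed dmsetup a two-segment linear table concatenating both partitions
  printf '0 %s linear %s 0\n%s %s linear %s 0\n' \
      "$len0" "$pv0" "$len0" "$len1" "$pv1" | dmsetup create nvme_dm_test
  mkfs.ext4 -qF /dev/mapper/nvme_dm_test   # same mkfs invocation as @71 above
  mkdir -p /tmp/dm_mount && mount /dev/mapper/nvme_dm_test /tmp/dm_mount
  # teardown mirrors cleanup_dm: unmount, then force-remove the dm node
  umount /tmp/dm_mount && dmsetup remove --force nvme_dm_test
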
00:03:58.231 09:27:20 -- setup/devices.sh@59 -- # local pci status 00:03:58.231 09:27:20 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:03:58.231 09:27:20 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:03:58.231 09:27:20 -- setup/devices.sh@47 -- # setup output config 00:03:58.231 09:27:20 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:58.231 09:27:20 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:01.524 09:27:24 -- setup/devices.sh@63 -- # found=1 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@62 -- # 
[[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:01.524 09:27:24 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:01.524 09:27:24 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:01.524 09:27:24 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:01.524 09:27:24 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:01.524 09:27:24 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:01.524 09:27:24 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:01.524 09:27:24 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:01.524 09:27:24 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:01.524 09:27:24 -- setup/devices.sh@50 -- # local mount_point= 00:04:01.524 09:27:24 -- setup/devices.sh@51 -- # local test_file= 00:04:01.524 09:27:24 -- setup/devices.sh@53 -- # local found=0 00:04:01.524 09:27:24 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:01.524 09:27:24 -- setup/devices.sh@59 -- # local pci status 00:04:01.524 09:27:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:01.524 09:27:24 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:01.524 09:27:24 -- setup/devices.sh@47 -- # setup output config 00:04:01.524 09:27:24 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:01.524 09:27:24 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:04.815 09:27:27 -- setup/devices.sh@63 -- # found=1 00:04:04.815 09:27:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:04.815 09:27:27 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:04.815 09:27:27 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:04.815 09:27:27 -- setup/devices.sh@68 -- # return 0 00:04:04.815 09:27:27 -- setup/devices.sh@187 -- # cleanup_dm 00:04:04.815 09:27:27 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:04.815 09:27:27 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:04.815 09:27:27 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:04.815 09:27:27 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:04.815 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:04.815 09:27:27 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:04.815 00:04:04.815 real 0m9.798s 00:04:04.815 user 0m2.361s 00:04:04.815 sys 0m4.473s 00:04:04.815 09:27:27 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:04.815 09:27:27 -- common/autotest_common.sh@10 -- # set +x 00:04:04.815 ************************************ 00:04:04.815 END TEST dm_mount 00:04:04.815 ************************************ 00:04:04.815 09:27:27 -- setup/devices.sh@1 -- # cleanup 00:04:04.815 09:27:27 -- setup/devices.sh@11 -- # cleanup_nvme 00:04:04.815 09:27:27 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:04.815 09:27:27 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:04.815 09:27:27 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:04.815 09:27:27 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:05.073 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:05.073 /dev/nvme0n1: 8 
bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:05.073 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:05.073 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:05.073 09:27:27 -- setup/devices.sh@12 -- # cleanup_dm 00:04:05.073 09:27:27 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:05.073 09:27:27 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:05.073 09:27:27 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:05.073 09:27:27 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:05.073 09:27:27 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:04:05.073 09:27:27 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:04:05.073 00:04:05.073 real 0m26.291s 00:04:05.073 user 0m7.159s 00:04:05.073 sys 0m13.793s 00:04:05.073 09:27:27 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:05.073 09:27:27 -- common/autotest_common.sh@10 -- # set +x 00:04:05.073 ************************************ 00:04:05.073 END TEST devices 00:04:05.073 ************************************ 00:04:05.073 00:04:05.073 real 1m31.816s 00:04:05.073 user 0m27.808s 00:04:05.073 sys 0m52.642s 00:04:05.073 09:27:27 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:05.073 09:27:27 -- common/autotest_common.sh@10 -- # set +x 00:04:05.073 ************************************ 00:04:05.073 END TEST setup.sh 00:04:05.073 ************************************ 00:04:05.073 09:27:27 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:08.365 Hugepages 00:04:08.365 node hugesize free / total 00:04:08.365 node0 1048576kB 0 / 0 00:04:08.365 node0 2048kB 2048 / 2048 00:04:08.365 node1 1048576kB 0 / 0 00:04:08.365 node1 2048kB 0 / 0 00:04:08.365 00:04:08.365 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:08.365 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:04:08.365 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:04:08.365 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:04:08.365 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:04:08.365 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:04:08.365 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:04:08.365 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:04:08.365 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:04:08.365 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:04:08.365 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:04:08.365 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:04:08.365 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:04:08.365 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:04:08.365 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:04:08.365 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:04:08.365 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:04:08.365 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:04:08.365 09:27:31 -- spdk/autotest.sh@128 -- # uname -s 00:04:08.365 09:27:31 -- spdk/autotest.sh@128 -- # [[ Linux == Linux ]] 00:04:08.365 09:27:31 -- spdk/autotest.sh@130 -- # nvme_namespace_revert 00:04:08.365 09:27:31 -- common/autotest_common.sh@1526 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:11.653 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:11.653 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:11.653 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:11.653 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:11.653 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 
00:04:11.653 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:11.653 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:11.653 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:11.653 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:11.653 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:11.653 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:11.653 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:11.653 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:11.653 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:11.653 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:11.653 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:13.033 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:13.033 09:27:35 -- common/autotest_common.sh@1527 -- # sleep 1 00:04:13.971 09:27:36 -- common/autotest_common.sh@1528 -- # bdfs=() 00:04:13.971 09:27:36 -- common/autotest_common.sh@1528 -- # local bdfs 00:04:13.971 09:27:36 -- common/autotest_common.sh@1529 -- # bdfs=($(get_nvme_bdfs)) 00:04:13.971 09:27:36 -- common/autotest_common.sh@1529 -- # get_nvme_bdfs 00:04:13.971 09:27:36 -- common/autotest_common.sh@1508 -- # bdfs=() 00:04:13.971 09:27:36 -- common/autotest_common.sh@1508 -- # local bdfs 00:04:13.971 09:27:36 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:13.971 09:27:36 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:13.971 09:27:36 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:04:14.230 09:27:36 -- common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:04:14.230 09:27:36 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:04:14.230 09:27:36 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:17.665 Waiting for block devices as requested 00:04:17.665 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:17.665 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:17.665 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:17.665 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:17.665 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:17.924 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:17.924 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:17.924 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:18.183 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:18.183 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:18.183 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:18.442 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:18.442 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:18.442 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:18.701 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:18.701 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:18.701 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:04:18.961 09:27:41 -- common/autotest_common.sh@1533 -- # for bdf in "${bdfs[@]}" 00:04:18.961 09:27:41 -- common/autotest_common.sh@1534 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:04:18.961 09:27:41 -- common/autotest_common.sh@1497 -- # readlink -f /sys/class/nvme/nvme0 00:04:18.961 09:27:41 -- common/autotest_common.sh@1497 -- # grep 0000:d8:00.0/nvme/nvme 00:04:18.961 09:27:41 -- common/autotest_common.sh@1497 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:04:18.961 09:27:41 -- common/autotest_common.sh@1498 -- # [[ -z 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:04:18.961 09:27:41 -- common/autotest_common.sh@1502 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:04:18.961 09:27:41 -- common/autotest_common.sh@1502 -- # printf '%s\n' nvme0 00:04:18.961 09:27:41 -- common/autotest_common.sh@1534 -- # nvme_ctrlr=/dev/nvme0 00:04:18.961 09:27:41 -- common/autotest_common.sh@1535 -- # [[ -z /dev/nvme0 ]] 00:04:18.961 09:27:41 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:04:18.961 09:27:41 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:04:18.961 09:27:41 -- common/autotest_common.sh@1540 -- # grep oacs 00:04:18.961 09:27:41 -- common/autotest_common.sh@1540 -- # oacs=' 0xe' 00:04:18.961 09:27:41 -- common/autotest_common.sh@1541 -- # oacs_ns_manage=8 00:04:18.961 09:27:41 -- common/autotest_common.sh@1543 -- # [[ 8 -ne 0 ]] 00:04:18.961 09:27:41 -- common/autotest_common.sh@1549 -- # nvme id-ctrl /dev/nvme0 00:04:18.961 09:27:41 -- common/autotest_common.sh@1549 -- # grep unvmcap 00:04:18.961 09:27:41 -- common/autotest_common.sh@1549 -- # cut -d: -f2 00:04:18.961 09:27:41 -- common/autotest_common.sh@1549 -- # unvmcap=' 0' 00:04:18.961 09:27:41 -- common/autotest_common.sh@1550 -- # [[ 0 -eq 0 ]] 00:04:18.961 09:27:41 -- common/autotest_common.sh@1552 -- # continue 00:04:18.961 09:27:41 -- spdk/autotest.sh@133 -- # timing_exit pre_cleanup 00:04:18.961 09:27:41 -- common/autotest_common.sh@728 -- # xtrace_disable 00:04:18.962 09:27:41 -- common/autotest_common.sh@10 -- # set +x 00:04:18.962 09:27:41 -- spdk/autotest.sh@136 -- # timing_enter afterboot 00:04:18.962 09:27:41 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:18.962 09:27:41 -- common/autotest_common.sh@10 -- # set +x 00:04:18.962 09:27:41 -- spdk/autotest.sh@137 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:22.253 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:22.253 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:22.253 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:22.253 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:22.253 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:22.253 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:22.253 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:22.253 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:22.253 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:22.253 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:22.253 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:22.253 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:22.253 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:22.253 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:22.253 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:22.253 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:23.632 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:04:23.891 09:27:46 -- spdk/autotest.sh@138 -- # timing_exit afterboot 00:04:23.891 09:27:46 -- common/autotest_common.sh@728 -- # xtrace_disable 00:04:23.891 09:27:46 -- common/autotest_common.sh@10 -- # set +x 00:04:23.891 09:27:46 -- spdk/autotest.sh@142 -- # opal_revert_cleanup 00:04:23.891 09:27:46 -- common/autotest_common.sh@1586 -- # mapfile -t bdfs 00:04:23.891 09:27:46 -- common/autotest_common.sh@1586 -- # get_nvme_bdfs_by_id 0x0a54 00:04:23.891 09:27:46 -- common/autotest_common.sh@1572 -- # bdfs=() 00:04:23.891 09:27:46 -- common/autotest_common.sh@1572 -- # local bdfs 00:04:23.891 09:27:46 -- common/autotest_common.sh@1574 -- # 
get_nvme_bdfs 00:04:23.891 09:27:46 -- common/autotest_common.sh@1508 -- # bdfs=() 00:04:23.891 09:27:46 -- common/autotest_common.sh@1508 -- # local bdfs 00:04:23.891 09:27:46 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:23.891 09:27:46 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:04:23.891 09:27:46 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:04:23.891 09:27:46 -- common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:04:23.891 09:27:46 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:04:23.891 09:27:46 -- common/autotest_common.sh@1574 -- # for bdf in $(get_nvme_bdfs) 00:04:23.891 09:27:46 -- common/autotest_common.sh@1575 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:04:23.891 09:27:46 -- common/autotest_common.sh@1575 -- # device=0x0a54 00:04:23.891 09:27:46 -- common/autotest_common.sh@1576 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:04:23.891 09:27:46 -- common/autotest_common.sh@1577 -- # bdfs+=($bdf) 00:04:23.891 09:27:46 -- common/autotest_common.sh@1581 -- # printf '%s\n' 0000:d8:00.0 00:04:23.891 09:27:46 -- common/autotest_common.sh@1587 -- # [[ -z 0000:d8:00.0 ]] 00:04:23.891 09:27:46 -- common/autotest_common.sh@1592 -- # spdk_tgt_pid=3152553 00:04:23.891 09:27:46 -- common/autotest_common.sh@1593 -- # waitforlisten 3152553 00:04:23.891 09:27:46 -- common/autotest_common.sh@829 -- # '[' -z 3152553 ']' 00:04:23.891 09:27:46 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:23.891 09:27:46 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:23.891 09:27:46 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:23.891 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:23.891 09:27:46 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:23.891 09:27:46 -- common/autotest_common.sh@10 -- # set +x 00:04:23.891 09:27:46 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:24.151 [2024-11-29 09:27:46.753915] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
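
The get_nvme_bdfs_by_id helper above keeps only the controllers whose PCI device id matches 0x0a54, by reading /sys/bus/pci/devices/<bdf>/device for each bdf that gen_nvme.sh reports. The same filter can be written directly against sysfs; a minimal sketch, where 0x0a54 is simply the device id observed in this log:

  want=0x0a54                              # NVMe device id to match (from the log)
  for dev in /sys/bus/pci/devices/*; do
      # keep NVMe controllers only: PCI class 0x010802
      [ "$(cat "$dev/class")" = 0x010802 ] || continue
      [ "$(cat "$dev/device")" = "$want" ] && basename "$dev"   # print matching bdf
  done
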
00:04:24.151 [2024-11-29 09:27:46.753980] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3152553 ] 00:04:24.151 EAL: No free 2048 kB hugepages reported on node 1 00:04:24.151 [2024-11-29 09:27:46.821991] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:24.151 [2024-11-29 09:27:46.894856] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:24.151 [2024-11-29 09:27:46.894982] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:25.087 09:27:47 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:25.087 09:27:47 -- common/autotest_common.sh@862 -- # return 0 00:04:25.087 09:27:47 -- common/autotest_common.sh@1595 -- # bdf_id=0 00:04:25.087 09:27:47 -- common/autotest_common.sh@1596 -- # for bdf in "${bdfs[@]}" 00:04:25.087 09:27:47 -- common/autotest_common.sh@1597 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:04:28.374 nvme0n1 00:04:28.374 09:27:50 -- common/autotest_common.sh@1599 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:04:28.374 [2024-11-29 09:27:50.734650] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:04:28.374 request: 00:04:28.374 { 00:04:28.374 "nvme_ctrlr_name": "nvme0", 00:04:28.374 "password": "test", 00:04:28.374 "method": "bdev_nvme_opal_revert", 00:04:28.374 "req_id": 1 00:04:28.374 } 00:04:28.374 Got JSON-RPC error response 00:04:28.374 response: 00:04:28.374 { 00:04:28.374 "code": -32602, 00:04:28.374 "message": "Invalid parameters" 00:04:28.374 } 00:04:28.374 09:27:50 -- common/autotest_common.sh@1599 -- # true 00:04:28.374 09:27:50 -- common/autotest_common.sh@1600 -- # (( ++bdf_id )) 00:04:28.374 09:27:50 -- common/autotest_common.sh@1603 -- # killprocess 3152553 00:04:28.374 09:27:50 -- common/autotest_common.sh@936 -- # '[' -z 3152553 ']' 00:04:28.374 09:27:50 -- common/autotest_common.sh@940 -- # kill -0 3152553 00:04:28.374 09:27:50 -- common/autotest_common.sh@941 -- # uname 00:04:28.374 09:27:50 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:04:28.374 09:27:50 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3152553 00:04:28.374 09:27:50 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:04:28.374 09:27:50 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:04:28.374 09:27:50 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3152553' 00:04:28.374 killing process with pid 3152553 00:04:28.374 09:27:50 -- common/autotest_common.sh@955 -- # kill 3152553 00:04:28.374 09:27:50 -- common/autotest_common.sh@960 -- # wait 3152553 00:04:30.277 09:27:52 -- spdk/autotest.sh@148 -- # '[' 0 -eq 1 ']' 00:04:30.277 09:27:52 -- spdk/autotest.sh@152 -- # '[' 1 -eq 1 ']' 00:04:30.277 09:27:52 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:04:30.277 09:27:52 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:04:30.277 09:27:52 -- spdk/autotest.sh@160 -- # timing_enter lib 00:04:30.277 09:27:52 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:30.277 09:27:52 -- common/autotest_common.sh@10 -- # set +x 00:04:30.277 09:27:52 -- spdk/autotest.sh@162 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:04:30.277 
09:27:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:30.277 09:27:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:30.277 09:27:52 -- common/autotest_common.sh@10 -- # set +x 00:04:30.277 ************************************ 00:04:30.277 START TEST env 00:04:30.277 ************************************ 00:04:30.277 09:27:52 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:04:30.277 * Looking for test storage... 00:04:30.277 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:04:30.277 09:27:53 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:30.277 09:27:53 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:30.277 09:27:53 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:30.537 09:27:53 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:30.537 09:27:53 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:30.537 09:27:53 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:30.537 09:27:53 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:30.537 09:27:53 -- scripts/common.sh@335 -- # IFS=.-: 00:04:30.537 09:27:53 -- scripts/common.sh@335 -- # read -ra ver1 00:04:30.537 09:27:53 -- scripts/common.sh@336 -- # IFS=.-: 00:04:30.537 09:27:53 -- scripts/common.sh@336 -- # read -ra ver2 00:04:30.537 09:27:53 -- scripts/common.sh@337 -- # local 'op=<' 00:04:30.537 09:27:53 -- scripts/common.sh@339 -- # ver1_l=2 00:04:30.537 09:27:53 -- scripts/common.sh@340 -- # ver2_l=1 00:04:30.537 09:27:53 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:30.537 09:27:53 -- scripts/common.sh@343 -- # case "$op" in 00:04:30.537 09:27:53 -- scripts/common.sh@344 -- # : 1 00:04:30.537 09:27:53 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:30.537 09:27:53 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:30.537 09:27:53 -- scripts/common.sh@364 -- # decimal 1 00:04:30.537 09:27:53 -- scripts/common.sh@352 -- # local d=1 00:04:30.537 09:27:53 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:30.537 09:27:53 -- scripts/common.sh@354 -- # echo 1 00:04:30.537 09:27:53 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:30.537 09:27:53 -- scripts/common.sh@365 -- # decimal 2 00:04:30.537 09:27:53 -- scripts/common.sh@352 -- # local d=2 00:04:30.537 09:27:53 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:30.537 09:27:53 -- scripts/common.sh@354 -- # echo 2 00:04:30.537 09:27:53 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:30.537 09:27:53 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:30.537 09:27:53 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:30.537 09:27:53 -- scripts/common.sh@367 -- # return 0 00:04:30.537 09:27:53 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:30.537 09:27:53 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:30.537 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:30.537 --rc genhtml_branch_coverage=1 00:04:30.537 --rc genhtml_function_coverage=1 00:04:30.537 --rc genhtml_legend=1 00:04:30.537 --rc geninfo_all_blocks=1 00:04:30.537 --rc geninfo_unexecuted_blocks=1 00:04:30.537 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:30.537 ' 00:04:30.537 09:27:53 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:30.537 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:30.537 --rc genhtml_branch_coverage=1 00:04:30.537 --rc genhtml_function_coverage=1 00:04:30.537 --rc genhtml_legend=1 00:04:30.537 --rc geninfo_all_blocks=1 00:04:30.537 --rc geninfo_unexecuted_blocks=1 00:04:30.537 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:30.537 ' 00:04:30.537 09:27:53 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:30.537 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:30.537 --rc genhtml_branch_coverage=1 00:04:30.537 --rc genhtml_function_coverage=1 00:04:30.537 --rc genhtml_legend=1 00:04:30.537 --rc geninfo_all_blocks=1 00:04:30.537 --rc geninfo_unexecuted_blocks=1 00:04:30.537 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:30.537 ' 00:04:30.537 09:27:53 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:30.537 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:30.537 --rc genhtml_branch_coverage=1 00:04:30.537 --rc genhtml_function_coverage=1 00:04:30.537 --rc genhtml_legend=1 00:04:30.537 --rc geninfo_all_blocks=1 00:04:30.537 --rc geninfo_unexecuted_blocks=1 00:04:30.537 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:30.537 ' 00:04:30.537 09:27:53 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:04:30.537 09:27:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:30.537 09:27:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:30.537 09:27:53 -- common/autotest_common.sh@10 -- # set +x 00:04:30.537 ************************************ 00:04:30.537 START TEST env_memory 00:04:30.537 ************************************ 00:04:30.537 09:27:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 
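
The cmp_versions walk above splits each version string on ".-:" and compares it component by component, which is how the installed lcov is tested against 1.15 before the legacy --rc spellings are exported. Where GNU coreutils is available, the same ordering test collapses to sort -V; a minimal equivalent sketch:

  version_lt() {   # true when $1 sorts strictly before $2 in version order
      [ "$1" = "$2" ] && return 1
      [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n1)" = "$1" ]
  }
  version_lt 1.15 2 && echo "lcov predates 2.x: use the legacy --rc spellings"
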
00:04:30.537 00:04:30.537 00:04:30.537 CUnit - A unit testing framework for C - Version 2.1-3 00:04:30.537 http://cunit.sourceforge.net/ 00:04:30.537 00:04:30.537 00:04:30.537 Suite: memory 00:04:30.537 Test: alloc and free memory map ...[2024-11-29 09:27:53.218318] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:04:30.537 passed 00:04:30.537 Test: mem map translation ...[2024-11-29 09:27:53.231322] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:04:30.537 [2024-11-29 09:27:53.231342] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:04:30.537 [2024-11-29 09:27:53.231373] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:04:30.537 [2024-11-29 09:27:53.231382] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:04:30.537 passed 00:04:30.537 Test: mem map registration ...[2024-11-29 09:27:53.252014] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:04:30.537 [2024-11-29 09:27:53.252038] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:04:30.537 passed 00:04:30.537 Test: mem map adjacent registrations ...passed 00:04:30.537 00:04:30.537 Run Summary: Type Total Ran Passed Failed Inactive 00:04:30.537 suites 1 1 n/a 0 0 00:04:30.537 tests 4 4 4 0 0 00:04:30.537 asserts 152 152 152 0 n/a 00:04:30.537 00:04:30.537 Elapsed time = 0.084 seconds 00:04:30.537 00:04:30.537 real 0m0.097s 00:04:30.537 user 0m0.086s 00:04:30.537 sys 0m0.011s 00:04:30.537 09:27:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:30.537 09:27:53 -- common/autotest_common.sh@10 -- # set +x 00:04:30.537 ************************************ 00:04:30.537 END TEST env_memory 00:04:30.537 ************************************ 00:04:30.537 09:27:53 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:04:30.537 09:27:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:30.537 09:27:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:30.537 09:27:53 -- common/autotest_common.sh@10 -- # set +x 00:04:30.537 ************************************ 00:04:30.537 START TEST env_vtophys 00:04:30.537 ************************************ 00:04:30.537 09:27:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:04:30.537 EAL: lib.eal log level changed from notice to debug 00:04:30.537 EAL: Detected lcore 0 as core 0 on socket 0 00:04:30.537 EAL: Detected lcore 1 as core 1 on socket 0 00:04:30.537 EAL: Detected lcore 2 as core 2 on socket 0 00:04:30.537 EAL: Detected lcore 3 as core 3 on socket 0 00:04:30.537 EAL: Detected lcore 4 as core 4 on socket 0 00:04:30.537 EAL: Detected lcore 5 as core 5 on socket 0 00:04:30.537 EAL: Detected lcore 6 as 
core 6 on socket 0 00:04:30.537 EAL: Detected lcore 7 as core 8 on socket 0 00:04:30.537 EAL: Detected lcore 8 as core 9 on socket 0 00:04:30.537 EAL: Detected lcore 9 as core 10 on socket 0 00:04:30.537 EAL: Detected lcore 10 as core 11 on socket 0 00:04:30.537 EAL: Detected lcore 11 as core 12 on socket 0 00:04:30.537 EAL: Detected lcore 12 as core 13 on socket 0 00:04:30.537 EAL: Detected lcore 13 as core 14 on socket 0 00:04:30.537 EAL: Detected lcore 14 as core 16 on socket 0 00:04:30.537 EAL: Detected lcore 15 as core 17 on socket 0 00:04:30.537 EAL: Detected lcore 16 as core 18 on socket 0 00:04:30.537 EAL: Detected lcore 17 as core 19 on socket 0 00:04:30.537 EAL: Detected lcore 18 as core 20 on socket 0 00:04:30.537 EAL: Detected lcore 19 as core 21 on socket 0 00:04:30.537 EAL: Detected lcore 20 as core 22 on socket 0 00:04:30.537 EAL: Detected lcore 21 as core 24 on socket 0 00:04:30.537 EAL: Detected lcore 22 as core 25 on socket 0 00:04:30.537 EAL: Detected lcore 23 as core 26 on socket 0 00:04:30.537 EAL: Detected lcore 24 as core 27 on socket 0 00:04:30.537 EAL: Detected lcore 25 as core 28 on socket 0 00:04:30.537 EAL: Detected lcore 26 as core 29 on socket 0 00:04:30.537 EAL: Detected lcore 27 as core 30 on socket 0 00:04:30.537 EAL: Detected lcore 28 as core 0 on socket 1 00:04:30.537 EAL: Detected lcore 29 as core 1 on socket 1 00:04:30.537 EAL: Detected lcore 30 as core 2 on socket 1 00:04:30.537 EAL: Detected lcore 31 as core 3 on socket 1 00:04:30.537 EAL: Detected lcore 32 as core 4 on socket 1 00:04:30.537 EAL: Detected lcore 33 as core 5 on socket 1 00:04:30.537 EAL: Detected lcore 34 as core 6 on socket 1 00:04:30.537 EAL: Detected lcore 35 as core 8 on socket 1 00:04:30.537 EAL: Detected lcore 36 as core 9 on socket 1 00:04:30.538 EAL: Detected lcore 37 as core 10 on socket 1 00:04:30.538 EAL: Detected lcore 38 as core 11 on socket 1 00:04:30.538 EAL: Detected lcore 39 as core 12 on socket 1 00:04:30.538 EAL: Detected lcore 40 as core 13 on socket 1 00:04:30.538 EAL: Detected lcore 41 as core 14 on socket 1 00:04:30.538 EAL: Detected lcore 42 as core 16 on socket 1 00:04:30.538 EAL: Detected lcore 43 as core 17 on socket 1 00:04:30.538 EAL: Detected lcore 44 as core 18 on socket 1 00:04:30.538 EAL: Detected lcore 45 as core 19 on socket 1 00:04:30.538 EAL: Detected lcore 46 as core 20 on socket 1 00:04:30.538 EAL: Detected lcore 47 as core 21 on socket 1 00:04:30.538 EAL: Detected lcore 48 as core 22 on socket 1 00:04:30.538 EAL: Detected lcore 49 as core 24 on socket 1 00:04:30.538 EAL: Detected lcore 50 as core 25 on socket 1 00:04:30.538 EAL: Detected lcore 51 as core 26 on socket 1 00:04:30.538 EAL: Detected lcore 52 as core 27 on socket 1 00:04:30.538 EAL: Detected lcore 53 as core 28 on socket 1 00:04:30.538 EAL: Detected lcore 54 as core 29 on socket 1 00:04:30.538 EAL: Detected lcore 55 as core 30 on socket 1 00:04:30.538 EAL: Detected lcore 56 as core 0 on socket 0 00:04:30.538 EAL: Detected lcore 57 as core 1 on socket 0 00:04:30.538 EAL: Detected lcore 58 as core 2 on socket 0 00:04:30.538 EAL: Detected lcore 59 as core 3 on socket 0 00:04:30.538 EAL: Detected lcore 60 as core 4 on socket 0 00:04:30.538 EAL: Detected lcore 61 as core 5 on socket 0 00:04:30.538 EAL: Detected lcore 62 as core 6 on socket 0 00:04:30.538 EAL: Detected lcore 63 as core 8 on socket 0 00:04:30.538 EAL: Detected lcore 64 as core 9 on socket 0 00:04:30.538 EAL: Detected lcore 65 as core 10 on socket 0 00:04:30.538 EAL: Detected lcore 66 as core 11 on socket 0 00:04:30.538 EAL: 
Detected lcore 67 as core 12 on socket 0 00:04:30.538 EAL: Detected lcore 68 as core 13 on socket 0 00:04:30.538 EAL: Detected lcore 69 as core 14 on socket 0 00:04:30.538 EAL: Detected lcore 70 as core 16 on socket 0 00:04:30.538 EAL: Detected lcore 71 as core 17 on socket 0 00:04:30.538 EAL: Detected lcore 72 as core 18 on socket 0 00:04:30.538 EAL: Detected lcore 73 as core 19 on socket 0 00:04:30.538 EAL: Detected lcore 74 as core 20 on socket 0 00:04:30.538 EAL: Detected lcore 75 as core 21 on socket 0 00:04:30.538 EAL: Detected lcore 76 as core 22 on socket 0 00:04:30.538 EAL: Detected lcore 77 as core 24 on socket 0 00:04:30.538 EAL: Detected lcore 78 as core 25 on socket 0 00:04:30.538 EAL: Detected lcore 79 as core 26 on socket 0 00:04:30.538 EAL: Detected lcore 80 as core 27 on socket 0 00:04:30.538 EAL: Detected lcore 81 as core 28 on socket 0 00:04:30.538 EAL: Detected lcore 82 as core 29 on socket 0 00:04:30.538 EAL: Detected lcore 83 as core 30 on socket 0 00:04:30.538 EAL: Detected lcore 84 as core 0 on socket 1 00:04:30.538 EAL: Detected lcore 85 as core 1 on socket 1 00:04:30.538 EAL: Detected lcore 86 as core 2 on socket 1 00:04:30.538 EAL: Detected lcore 87 as core 3 on socket 1 00:04:30.538 EAL: Detected lcore 88 as core 4 on socket 1 00:04:30.538 EAL: Detected lcore 89 as core 5 on socket 1 00:04:30.538 EAL: Detected lcore 90 as core 6 on socket 1 00:04:30.538 EAL: Detected lcore 91 as core 8 on socket 1 00:04:30.538 EAL: Detected lcore 92 as core 9 on socket 1 00:04:30.538 EAL: Detected lcore 93 as core 10 on socket 1 00:04:30.538 EAL: Detected lcore 94 as core 11 on socket 1 00:04:30.538 EAL: Detected lcore 95 as core 12 on socket 1 00:04:30.538 EAL: Detected lcore 96 as core 13 on socket 1 00:04:30.538 EAL: Detected lcore 97 as core 14 on socket 1 00:04:30.538 EAL: Detected lcore 98 as core 16 on socket 1 00:04:30.538 EAL: Detected lcore 99 as core 17 on socket 1 00:04:30.538 EAL: Detected lcore 100 as core 18 on socket 1 00:04:30.538 EAL: Detected lcore 101 as core 19 on socket 1 00:04:30.538 EAL: Detected lcore 102 as core 20 on socket 1 00:04:30.538 EAL: Detected lcore 103 as core 21 on socket 1 00:04:30.538 EAL: Detected lcore 104 as core 22 on socket 1 00:04:30.538 EAL: Detected lcore 105 as core 24 on socket 1 00:04:30.538 EAL: Detected lcore 106 as core 25 on socket 1 00:04:30.538 EAL: Detected lcore 107 as core 26 on socket 1 00:04:30.538 EAL: Detected lcore 108 as core 27 on socket 1 00:04:30.538 EAL: Detected lcore 109 as core 28 on socket 1 00:04:30.538 EAL: Detected lcore 110 as core 29 on socket 1 00:04:30.538 EAL: Detected lcore 111 as core 30 on socket 1 00:04:30.538 EAL: Maximum logical cores by configuration: 128 00:04:30.538 EAL: Detected CPU lcores: 112 00:04:30.538 EAL: Detected NUMA nodes: 2 00:04:30.538 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:04:30.538 EAL: Checking presence of .so 'librte_eal.so.24' 00:04:30.538 EAL: Checking presence of .so 'librte_eal.so' 00:04:30.538 EAL: Detected static linkage of DPDK 00:04:30.538 EAL: No shared files mode enabled, IPC will be disabled 00:04:30.798 EAL: Bus pci wants IOVA as 'DC' 00:04:30.798 EAL: Buses did not request a specific IOVA mode. 00:04:30.798 EAL: IOMMU is available, selecting IOVA as VA mode. 00:04:30.798 EAL: Selected IOVA mode 'VA' 00:04:30.798 EAL: No free 2048 kB hugepages reported on node 1 00:04:30.798 EAL: Probing VFIO support... 
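
The lcore map printed above ("Detected lcore N as core C on socket S") mirrors the kernel's CPU topology files, which DPDK's EAL reads at startup. The same table can be reproduced straight from sysfs; a minimal sketch:

  for cpu in /sys/devices/system/cpu/cpu[0-9]*; do
      n=${cpu##*cpu}                                  # logical CPU (lcore) number
      core=$(cat "$cpu/topology/core_id")             # physical core in the package
      sock=$(cat "$cpu/topology/physical_package_id") # socket number
      echo "lcore $n as core $core on socket $sock"
  done | sort -t' ' -k2,2n
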
00:04:30.798 EAL: IOMMU type 1 (Type 1) is supported 00:04:30.798 EAL: IOMMU type 7 (sPAPR) is not supported 00:04:30.798 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:04:30.798 EAL: VFIO support initialized 00:04:30.798 EAL: Ask a virtual area of 0x2e000 bytes 00:04:30.798 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:04:30.798 EAL: Setting up physically contiguous memory... 00:04:30.798 EAL: Setting maximum number of open files to 524288 00:04:30.798 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:04:30.798 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:04:30.798 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:04:30.798 EAL: Ask a virtual area of 0x61000 bytes 00:04:30.798 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:04:30.798 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:30.798 EAL: Ask a virtual area of 0x400000000 bytes 00:04:30.798 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:04:30.798 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:04:30.798 EAL: Ask a virtual area of 0x61000 bytes 00:04:30.798 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:04:30.798 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:30.798 EAL: Ask a virtual area of 0x400000000 bytes 00:04:30.798 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:04:30.798 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:04:30.798 EAL: Ask a virtual area of 0x61000 bytes 00:04:30.798 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:04:30.798 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:30.798 EAL: Ask a virtual area of 0x400000000 bytes 00:04:30.798 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:04:30.798 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:04:30.798 EAL: Ask a virtual area of 0x61000 bytes 00:04:30.798 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:04:30.798 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:30.798 EAL: Ask a virtual area of 0x400000000 bytes 00:04:30.798 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:04:30.798 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:04:30.798 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:04:30.798 EAL: Ask a virtual area of 0x61000 bytes 00:04:30.798 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:04:30.798 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:30.798 EAL: Ask a virtual area of 0x400000000 bytes 00:04:30.798 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:04:30.798 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:04:30.798 EAL: Ask a virtual area of 0x61000 bytes 00:04:30.798 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:04:30.798 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:30.798 EAL: Ask a virtual area of 0x400000000 bytes 00:04:30.798 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:04:30.798 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:04:30.798 EAL: Ask a virtual area of 0x61000 bytes 00:04:30.798 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:04:30.798 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:30.798 EAL: Ask a virtual area of 0x400000000 bytes 00:04:30.798 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:04:30.798 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:04:30.798 EAL: Ask a virtual area of 0x61000 bytes 00:04:30.798 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:04:30.798 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:04:30.798 EAL: Ask a virtual area of 0x400000000 bytes 00:04:30.798 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:04:30.798 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:04:30.798 EAL: Hugepages will be freed exactly as allocated. 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: TSC frequency is ~2500000 KHz 00:04:30.798 EAL: Main lcore 0 is ready (tid=7fc8f367ea00;cpuset=[0]) 00:04:30.798 EAL: Trying to obtain current memory policy. 00:04:30.798 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:30.798 EAL: Restoring previous memory policy: 0 00:04:30.798 EAL: request: mp_malloc_sync 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Heap on socket 0 was expanded by 2MB 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Mem event callback 'spdk:(nil)' registered 00:04:30.798 00:04:30.798 00:04:30.798 CUnit - A unit testing framework for C - Version 2.1-3 00:04:30.798 http://cunit.sourceforge.net/ 00:04:30.798 00:04:30.798 00:04:30.798 Suite: components_suite 00:04:30.798 Test: vtophys_malloc_test ...passed 00:04:30.798 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:04:30.798 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:30.798 EAL: Restoring previous memory policy: 4 00:04:30.798 EAL: Calling mem event callback 'spdk:(nil)' 00:04:30.798 EAL: request: mp_malloc_sync 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Heap on socket 0 was expanded by 4MB 00:04:30.798 EAL: Calling mem event callback 'spdk:(nil)' 00:04:30.798 EAL: request: mp_malloc_sync 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Heap on socket 0 was shrunk by 4MB 00:04:30.798 EAL: Trying to obtain current memory policy. 00:04:30.798 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:30.798 EAL: Restoring previous memory policy: 4 00:04:30.798 EAL: Calling mem event callback 'spdk:(nil)' 00:04:30.798 EAL: request: mp_malloc_sync 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Heap on socket 0 was expanded by 6MB 00:04:30.798 EAL: Calling mem event callback 'spdk:(nil)' 00:04:30.798 EAL: request: mp_malloc_sync 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Heap on socket 0 was shrunk by 6MB 00:04:30.798 EAL: Trying to obtain current memory policy. 00:04:30.798 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:30.798 EAL: Restoring previous memory policy: 4 00:04:30.798 EAL: Calling mem event callback 'spdk:(nil)' 00:04:30.798 EAL: request: mp_malloc_sync 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Heap on socket 0 was expanded by 10MB 00:04:30.798 EAL: Calling mem event callback 'spdk:(nil)' 00:04:30.798 EAL: request: mp_malloc_sync 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Heap on socket 0 was shrunk by 10MB 00:04:30.798 EAL: Trying to obtain current memory policy. 
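
The repeated "No free 2048 kB hugepages reported on node 1" notice is consistent with the hugepage table earlier in this log (node0: 2048 pages of 2048 kB, node1: none) and appears informational here, since the tests run with memory preferred on socket 0. The per-node counters live in sysfs; a minimal sketch for inspecting, and optionally topping up, node 1:

  base=/sys/devices/system/node/node1/hugepages/hugepages-2048kB
  echo "node1 2 MB hugepages free/total: $(cat $base/free_hugepages)/$(cat $base/nr_hugepages)"
  # to reserve 1024 pages (2 GiB) on node 1 (needs root and free contiguous memory):
  # echo 1024 > $base/nr_hugepages
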
00:04:30.798 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:30.798 EAL: Restoring previous memory policy: 4 00:04:30.798 EAL: Calling mem event callback 'spdk:(nil)' 00:04:30.798 EAL: request: mp_malloc_sync 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Heap on socket 0 was expanded by 18MB 00:04:30.798 EAL: Calling mem event callback 'spdk:(nil)' 00:04:30.798 EAL: request: mp_malloc_sync 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Heap on socket 0 was shrunk by 18MB 00:04:30.798 EAL: Trying to obtain current memory policy. 00:04:30.798 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:30.798 EAL: Restoring previous memory policy: 4 00:04:30.798 EAL: Calling mem event callback 'spdk:(nil)' 00:04:30.798 EAL: request: mp_malloc_sync 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Heap on socket 0 was expanded by 34MB 00:04:30.798 EAL: Calling mem event callback 'spdk:(nil)' 00:04:30.798 EAL: request: mp_malloc_sync 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Heap on socket 0 was shrunk by 34MB 00:04:30.798 EAL: Trying to obtain current memory policy. 00:04:30.798 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:30.798 EAL: Restoring previous memory policy: 4 00:04:30.798 EAL: Calling mem event callback 'spdk:(nil)' 00:04:30.798 EAL: request: mp_malloc_sync 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Heap on socket 0 was expanded by 66MB 00:04:30.798 EAL: Calling mem event callback 'spdk:(nil)' 00:04:30.798 EAL: request: mp_malloc_sync 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Heap on socket 0 was shrunk by 66MB 00:04:30.798 EAL: Trying to obtain current memory policy. 00:04:30.798 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:30.798 EAL: Restoring previous memory policy: 4 00:04:30.798 EAL: Calling mem event callback 'spdk:(nil)' 00:04:30.798 EAL: request: mp_malloc_sync 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Heap on socket 0 was expanded by 130MB 00:04:30.798 EAL: Calling mem event callback 'spdk:(nil)' 00:04:30.798 EAL: request: mp_malloc_sync 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Heap on socket 0 was shrunk by 130MB 00:04:30.798 EAL: Trying to obtain current memory policy. 00:04:30.798 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:30.798 EAL: Restoring previous memory policy: 4 00:04:30.798 EAL: Calling mem event callback 'spdk:(nil)' 00:04:30.798 EAL: request: mp_malloc_sync 00:04:30.798 EAL: No shared files mode enabled, IPC is disabled 00:04:30.798 EAL: Heap on socket 0 was expanded by 258MB 00:04:30.798 EAL: Calling mem event callback 'spdk:(nil)' 00:04:31.058 EAL: request: mp_malloc_sync 00:04:31.058 EAL: No shared files mode enabled, IPC is disabled 00:04:31.058 EAL: Heap on socket 0 was shrunk by 258MB 00:04:31.058 EAL: Trying to obtain current memory policy. 
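
The expansion sizes in this suite (4, 6, 10, 18, 34, 66 and 130 MB above, then 258, 514 and 1026 MB below) are each a power-of-two test allocation plus one extra 2 MB hugepage, presumably allocator bookkeeping; that is an inference from the numbers, not something the log states. The series is easy to check:

  for k in $(seq 1 10); do
      alloc=$((1 << k))                    # test allocation in MB: 2, 4, ..., 1024
      echo "alloc ${alloc} MB -> heap expands by $((alloc + 2)) MB"
  done
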
00:04:31.058 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:31.058 EAL: Restoring previous memory policy: 4 00:04:31.058 EAL: Calling mem event callback 'spdk:(nil)' 00:04:31.058 EAL: request: mp_malloc_sync 00:04:31.058 EAL: No shared files mode enabled, IPC is disabled 00:04:31.058 EAL: Heap on socket 0 was expanded by 514MB 00:04:31.058 EAL: Calling mem event callback 'spdk:(nil)' 00:04:31.316 EAL: request: mp_malloc_sync 00:04:31.316 EAL: No shared files mode enabled, IPC is disabled 00:04:31.316 EAL: Heap on socket 0 was shrunk by 514MB 00:04:31.316 EAL: Trying to obtain current memory policy. 00:04:31.316 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:31.316 EAL: Restoring previous memory policy: 4 00:04:31.316 EAL: Calling mem event callback 'spdk:(nil)' 00:04:31.316 EAL: request: mp_malloc_sync 00:04:31.316 EAL: No shared files mode enabled, IPC is disabled 00:04:31.316 EAL: Heap on socket 0 was expanded by 1026MB 00:04:31.575 EAL: Calling mem event callback 'spdk:(nil)' 00:04:31.834 EAL: request: mp_malloc_sync 00:04:31.834 EAL: No shared files mode enabled, IPC is disabled 00:04:31.834 EAL: Heap on socket 0 was shrunk by 1026MB 00:04:31.834 passed 00:04:31.834 00:04:31.834 Run Summary: Type Total Ran Passed Failed Inactive 00:04:31.834 suites 1 1 n/a 0 0 00:04:31.834 tests 2 2 2 0 0 00:04:31.834 asserts 497 497 497 0 n/a 00:04:31.834 00:04:31.834 Elapsed time = 0.967 seconds 00:04:31.834 EAL: Calling mem event callback 'spdk:(nil)' 00:04:31.834 EAL: request: mp_malloc_sync 00:04:31.834 EAL: No shared files mode enabled, IPC is disabled 00:04:31.834 EAL: Heap on socket 0 was shrunk by 2MB 00:04:31.834 EAL: No shared files mode enabled, IPC is disabled 00:04:31.834 EAL: No shared files mode enabled, IPC is disabled 00:04:31.834 EAL: No shared files mode enabled, IPC is disabled 00:04:31.834 00:04:31.834 real 0m1.095s 00:04:31.834 user 0m0.631s 00:04:31.834 sys 0m0.431s 00:04:31.834 09:27:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:31.834 09:27:54 -- common/autotest_common.sh@10 -- # set +x 00:04:31.834 ************************************ 00:04:31.834 END TEST env_vtophys 00:04:31.834 ************************************ 00:04:31.834 09:27:54 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:04:31.834 09:27:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:31.834 09:27:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:31.834 09:27:54 -- common/autotest_common.sh@10 -- # set +x 00:04:31.834 ************************************ 00:04:31.834 START TEST env_pci 00:04:31.834 ************************************ 00:04:31.834 09:27:54 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:04:31.834 00:04:31.834 00:04:31.834 CUnit - A unit testing framework for C - Version 2.1-3 00:04:31.834 http://cunit.sourceforge.net/ 00:04:31.834 00:04:31.834 00:04:31.834 Suite: pci 00:04:31.834 Test: pci_hook ...[2024-11-29 09:27:54.484672] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 3154057 has claimed it 00:04:31.834 EAL: Cannot find device (10000:00:01.0) 00:04:31.834 EAL: Failed to attach device on primary process 00:04:31.834 passed 00:04:31.834 00:04:31.834 Run Summary: Type Total Ran Passed Failed Inactive 00:04:31.834 suites 1 1 n/a 0 0 00:04:31.834 tests 1 1 1 0 0 
00:04:31.834 asserts 25 25 25 0 n/a 00:04:31.834 00:04:31.834 Elapsed time = 0.031 seconds 00:04:31.834 00:04:31.834 real 0m0.048s 00:04:31.834 user 0m0.010s 00:04:31.834 sys 0m0.038s 00:04:31.834 09:27:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:31.834 09:27:54 -- common/autotest_common.sh@10 -- # set +x 00:04:31.834 ************************************ 00:04:31.834 END TEST env_pci 00:04:31.834 ************************************ 00:04:31.834 09:27:54 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:04:31.834 09:27:54 -- env/env.sh@15 -- # uname 00:04:31.834 09:27:54 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:04:31.834 09:27:54 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:04:31.834 09:27:54 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:04:31.834 09:27:54 -- common/autotest_common.sh@1087 -- # '[' 5 -le 1 ']' 00:04:31.834 09:27:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:31.834 09:27:54 -- common/autotest_common.sh@10 -- # set +x 00:04:31.834 ************************************ 00:04:31.834 START TEST env_dpdk_post_init 00:04:31.834 ************************************ 00:04:31.834 09:27:54 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:04:31.834 EAL: Detected CPU lcores: 112 00:04:31.834 EAL: Detected NUMA nodes: 2 00:04:31.834 EAL: Detected static linkage of DPDK 00:04:31.834 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:04:31.834 EAL: Selected IOVA mode 'VA' 00:04:31.834 EAL: No free 2048 kB hugepages reported on node 1 00:04:31.834 EAL: VFIO support initialized 00:04:31.834 TELEMETRY: No legacy callbacks, legacy socket not created 00:04:32.094 EAL: Using IOMMU type 1 (Type 1) 00:04:32.661 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:04:36.852 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:04:36.852 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:04:36.852 Starting DPDK initialization... 00:04:36.852 Starting SPDK post initialization... 00:04:36.852 SPDK NVMe probe 00:04:36.852 Attaching to 0000:d8:00.0 00:04:36.852 Attached to 0000:d8:00.0 00:04:36.852 Cleaning up... 
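The env_dpdk_post_init run above is an ordinary SPDK app invocation. A condensed form of the command, with the workspace prefix dropped, would look like the sketch below; -c 0x1 restricts the app to CPU core 0 and --base-virtaddr asks EAL to map hugepage memory at a fixed address so that cooperating processes see an identical layout:

  ./env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000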
00:04:36.852 00:04:36.852 real 0m4.735s 00:04:36.852 user 0m3.596s 00:04:36.852 sys 0m0.386s 00:04:36.852 09:27:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:36.852 09:27:59 -- common/autotest_common.sh@10 -- # set +x 00:04:36.852 ************************************ 00:04:36.852 END TEST env_dpdk_post_init 00:04:36.852 ************************************ 00:04:36.852 09:27:59 -- env/env.sh@26 -- # uname 00:04:36.852 09:27:59 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:04:36.852 09:27:59 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:04:36.852 09:27:59 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:36.853 09:27:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:36.853 09:27:59 -- common/autotest_common.sh@10 -- # set +x 00:04:36.853 ************************************ 00:04:36.853 START TEST env_mem_callbacks 00:04:36.853 ************************************ 00:04:36.853 09:27:59 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:04:36.853 EAL: Detected CPU lcores: 112 00:04:36.853 EAL: Detected NUMA nodes: 2 00:04:36.853 EAL: Detected static linkage of DPDK 00:04:36.853 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:04:36.853 EAL: Selected IOVA mode 'VA' 00:04:36.853 EAL: No free 2048 kB hugepages reported on node 1 00:04:36.853 EAL: VFIO support initialized 00:04:36.853 TELEMETRY: No legacy callbacks, legacy socket not created 00:04:36.853 00:04:36.853 00:04:36.853 CUnit - A unit testing framework for C - Version 2.1-3 00:04:36.853 http://cunit.sourceforge.net/ 00:04:36.853 00:04:36.853 00:04:36.853 Suite: memory 00:04:36.853 Test: test ... 
00:04:36.853 register 0x200000200000 2097152 00:04:36.853 malloc 3145728 00:04:36.853 register 0x200000400000 4194304 00:04:36.853 buf 0x200000500000 len 3145728 PASSED 00:04:36.853 malloc 64 00:04:36.853 buf 0x2000004fff40 len 64 PASSED 00:04:36.853 malloc 4194304 00:04:36.853 register 0x200000800000 6291456 00:04:36.853 buf 0x200000a00000 len 4194304 PASSED 00:04:36.853 free 0x200000500000 3145728 00:04:36.853 free 0x2000004fff40 64 00:04:36.853 unregister 0x200000400000 4194304 PASSED 00:04:36.853 free 0x200000a00000 4194304 00:04:36.853 unregister 0x200000800000 6291456 PASSED 00:04:36.853 malloc 8388608 00:04:36.853 register 0x200000400000 10485760 00:04:36.853 buf 0x200000600000 len 8388608 PASSED 00:04:36.853 free 0x200000600000 8388608 00:04:36.853 unregister 0x200000400000 10485760 PASSED 00:04:36.853 passed 00:04:36.853 00:04:36.853 Run Summary: Type Total Ran Passed Failed Inactive 00:04:36.853 suites 1 1 n/a 0 0 00:04:36.853 tests 1 1 1 0 0 00:04:36.853 asserts 15 15 15 0 n/a 00:04:36.853 00:04:36.853 Elapsed time = 0.005 seconds 00:04:36.853 00:04:36.853 real 0m0.065s 00:04:36.853 user 0m0.020s 00:04:36.853 sys 0m0.045s 00:04:36.853 09:27:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:36.853 09:27:59 -- common/autotest_common.sh@10 -- # set +x 00:04:36.853 ************************************ 00:04:36.853 END TEST env_mem_callbacks 00:04:36.853 ************************************ 00:04:36.853 00:04:36.853 real 0m6.480s 00:04:36.853 user 0m4.533s 00:04:36.853 sys 0m1.217s 00:04:36.853 09:27:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:36.853 09:27:59 -- common/autotest_common.sh@10 -- # set +x 00:04:36.853 ************************************ 00:04:36.853 END TEST env 00:04:36.853 ************************************ 00:04:36.853 09:27:59 -- spdk/autotest.sh@163 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:04:36.853 09:27:59 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:36.853 09:27:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:36.853 09:27:59 -- common/autotest_common.sh@10 -- # set +x 00:04:36.853 ************************************ 00:04:36.853 START TEST rpc 00:04:36.853 ************************************ 00:04:36.853 09:27:59 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:04:36.853 * Looking for test storage... 
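The START TEST / END TEST banners that frame every test above come from the harness's run_test wrapper. A minimal sketch of that wrapper, reconstructed only from the behaviour visible in this log (the real helper in autotest_common.sh also manages xtrace and argument validation):

  run_test() {
      local name=$1; shift
      echo "************************************"
      echo "START TEST $name"
      echo "************************************"
      "$@"                      # run the test command with its arguments
      local rc=$?
      echo "************************************"
      echo "END TEST $name"
      echo "************************************"
      return $rc
  }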
00:04:36.853 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:04:36.853 09:27:59 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:36.853 09:27:59 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:36.853 09:27:59 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:36.853 09:27:59 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:36.853 09:27:59 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:36.853 09:27:59 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:36.853 09:27:59 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:36.853 09:27:59 -- scripts/common.sh@335 -- # IFS=.-: 00:04:36.853 09:27:59 -- scripts/common.sh@335 -- # read -ra ver1 00:04:36.853 09:27:59 -- scripts/common.sh@336 -- # IFS=.-: 00:04:36.853 09:27:59 -- scripts/common.sh@336 -- # read -ra ver2 00:04:36.853 09:27:59 -- scripts/common.sh@337 -- # local 'op=<' 00:04:36.853 09:27:59 -- scripts/common.sh@339 -- # ver1_l=2 00:04:36.853 09:27:59 -- scripts/common.sh@340 -- # ver2_l=1 00:04:36.853 09:27:59 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:36.853 09:27:59 -- scripts/common.sh@343 -- # case "$op" in 00:04:36.853 09:27:59 -- scripts/common.sh@344 -- # : 1 00:04:36.853 09:27:59 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:36.853 09:27:59 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:37.113 09:27:59 -- scripts/common.sh@364 -- # decimal 1 00:04:37.113 09:27:59 -- scripts/common.sh@352 -- # local d=1 00:04:37.113 09:27:59 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:37.113 09:27:59 -- scripts/common.sh@354 -- # echo 1 00:04:37.113 09:27:59 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:37.113 09:27:59 -- scripts/common.sh@365 -- # decimal 2 00:04:37.113 09:27:59 -- scripts/common.sh@352 -- # local d=2 00:04:37.113 09:27:59 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:37.113 09:27:59 -- scripts/common.sh@354 -- # echo 2 00:04:37.113 09:27:59 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:37.113 09:27:59 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:37.113 09:27:59 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:37.113 09:27:59 -- scripts/common.sh@367 -- # return 0 00:04:37.113 09:27:59 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:37.113 09:27:59 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:37.113 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:37.113 --rc genhtml_branch_coverage=1 00:04:37.113 --rc genhtml_function_coverage=1 00:04:37.113 --rc genhtml_legend=1 00:04:37.113 --rc geninfo_all_blocks=1 00:04:37.113 --rc geninfo_unexecuted_blocks=1 00:04:37.113 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:37.113 ' 00:04:37.113 09:27:59 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:37.113 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:37.113 --rc genhtml_branch_coverage=1 00:04:37.113 --rc genhtml_function_coverage=1 00:04:37.113 --rc genhtml_legend=1 00:04:37.113 --rc geninfo_all_blocks=1 00:04:37.113 --rc geninfo_unexecuted_blocks=1 00:04:37.113 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:37.113 ' 00:04:37.113 09:27:59 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:37.113 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:37.113 --rc genhtml_branch_coverage=1 00:04:37.113 
--rc genhtml_function_coverage=1 00:04:37.113 --rc genhtml_legend=1 00:04:37.113 --rc geninfo_all_blocks=1 00:04:37.113 --rc geninfo_unexecuted_blocks=1 00:04:37.113 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:37.113 ' 00:04:37.113 09:27:59 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:37.113 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:37.113 --rc genhtml_branch_coverage=1 00:04:37.113 --rc genhtml_function_coverage=1 00:04:37.113 --rc genhtml_legend=1 00:04:37.113 --rc geninfo_all_blocks=1 00:04:37.113 --rc geninfo_unexecuted_blocks=1 00:04:37.113 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:37.113 ' 00:04:37.113 09:27:59 -- rpc/rpc.sh@65 -- # spdk_pid=3155039 00:04:37.113 09:27:59 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:04:37.114 09:27:59 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:04:37.114 09:27:59 -- rpc/rpc.sh@67 -- # waitforlisten 3155039 00:04:37.114 09:27:59 -- common/autotest_common.sh@829 -- # '[' -z 3155039 ']' 00:04:37.114 09:27:59 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:37.114 09:27:59 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:37.114 09:27:59 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:37.114 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:37.114 09:27:59 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:37.114 09:27:59 -- common/autotest_common.sh@10 -- # set +x 00:04:37.114 [2024-11-29 09:27:59.734329] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:04:37.114 [2024-11-29 09:27:59.734398] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3155039 ] 00:04:37.114 EAL: No free 2048 kB hugepages reported on node 1 00:04:37.114 [2024-11-29 09:27:59.802909] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:37.114 [2024-11-29 09:27:59.873089] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:37.114 [2024-11-29 09:27:59.873218] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:04:37.114 [2024-11-29 09:27:59.873229] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 3155039' to capture a snapshot of events at runtime. 00:04:37.114 [2024-11-29 09:27:59.873238] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid3155039 for offline analysis/debug. 
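The rpc suite drives a live target. Per the trace above, the bring-up boils down to the following, with the workspace path shortened; -e bdev enables the bdev tracepoint group, which is why a trace shm file exists and the bdev mask is fully set later on:

  ./build/bin/spdk_tgt -e bdev &
  spdk_pid=$!
  trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
  waitforlisten "$spdk_pid"   # harness helper: polls /var/tmp/spdk.sock until the target answers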
00:04:37.114 [2024-11-29 09:27:59.873259] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:38.053 09:28:00 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:38.053 09:28:00 -- common/autotest_common.sh@862 -- # return 0 00:04:38.053 09:28:00 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:04:38.053 09:28:00 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:04:38.053 09:28:00 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:04:38.053 09:28:00 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:04:38.053 09:28:00 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:38.053 09:28:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:38.053 09:28:00 -- common/autotest_common.sh@10 -- # set +x 00:04:38.053 ************************************ 00:04:38.053 START TEST rpc_integrity 00:04:38.053 ************************************ 00:04:38.053 09:28:00 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:04:38.053 09:28:00 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:04:38.053 09:28:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.053 09:28:00 -- common/autotest_common.sh@10 -- # set +x 00:04:38.053 09:28:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.053 09:28:00 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:04:38.053 09:28:00 -- rpc/rpc.sh@13 -- # jq length 00:04:38.053 09:28:00 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:04:38.053 09:28:00 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:04:38.053 09:28:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.053 09:28:00 -- common/autotest_common.sh@10 -- # set +x 00:04:38.053 09:28:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.053 09:28:00 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:04:38.053 09:28:00 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:04:38.053 09:28:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.053 09:28:00 -- common/autotest_common.sh@10 -- # set +x 00:04:38.053 09:28:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.053 09:28:00 -- rpc/rpc.sh@16 -- # bdevs='[ 00:04:38.053 { 00:04:38.053 "name": "Malloc0", 00:04:38.053 "aliases": [ 00:04:38.053 "9ac6faa6-2cd1-4635-9bdf-68778e71a0e6" 00:04:38.053 ], 00:04:38.053 "product_name": "Malloc disk", 00:04:38.053 "block_size": 512, 00:04:38.053 "num_blocks": 16384, 00:04:38.053 "uuid": "9ac6faa6-2cd1-4635-9bdf-68778e71a0e6", 00:04:38.053 "assigned_rate_limits": { 00:04:38.053 "rw_ios_per_sec": 0, 00:04:38.053 "rw_mbytes_per_sec": 0, 00:04:38.053 "r_mbytes_per_sec": 0, 00:04:38.053 "w_mbytes_per_sec": 0 00:04:38.053 }, 00:04:38.053 "claimed": false, 00:04:38.053 "zoned": false, 00:04:38.053 "supported_io_types": { 00:04:38.053 "read": true, 00:04:38.053 "write": true, 00:04:38.053 "unmap": true, 00:04:38.053 "write_zeroes": true, 00:04:38.053 "flush": true, 00:04:38.053 "reset": true, 00:04:38.053 "compare": false, 00:04:38.053 "compare_and_write": false, 
00:04:38.053 "abort": true, 00:04:38.053 "nvme_admin": false, 00:04:38.053 "nvme_io": false 00:04:38.053 }, 00:04:38.053 "memory_domains": [ 00:04:38.053 { 00:04:38.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:38.053 "dma_device_type": 2 00:04:38.053 } 00:04:38.053 ], 00:04:38.053 "driver_specific": {} 00:04:38.053 } 00:04:38.053 ]' 00:04:38.053 09:28:00 -- rpc/rpc.sh@17 -- # jq length 00:04:38.053 09:28:00 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:04:38.053 09:28:00 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:04:38.053 09:28:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.053 09:28:00 -- common/autotest_common.sh@10 -- # set +x 00:04:38.053 [2024-11-29 09:28:00.685334] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:04:38.053 [2024-11-29 09:28:00.685371] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:04:38.053 [2024-11-29 09:28:00.685398] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5806030 00:04:38.053 [2024-11-29 09:28:00.685409] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:04:38.053 [2024-11-29 09:28:00.686252] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:04:38.053 [2024-11-29 09:28:00.686276] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:04:38.053 Passthru0 00:04:38.053 09:28:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.053 09:28:00 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:04:38.053 09:28:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.053 09:28:00 -- common/autotest_common.sh@10 -- # set +x 00:04:38.053 09:28:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.053 09:28:00 -- rpc/rpc.sh@20 -- # bdevs='[ 00:04:38.053 { 00:04:38.053 "name": "Malloc0", 00:04:38.053 "aliases": [ 00:04:38.053 "9ac6faa6-2cd1-4635-9bdf-68778e71a0e6" 00:04:38.053 ], 00:04:38.053 "product_name": "Malloc disk", 00:04:38.053 "block_size": 512, 00:04:38.053 "num_blocks": 16384, 00:04:38.053 "uuid": "9ac6faa6-2cd1-4635-9bdf-68778e71a0e6", 00:04:38.053 "assigned_rate_limits": { 00:04:38.053 "rw_ios_per_sec": 0, 00:04:38.053 "rw_mbytes_per_sec": 0, 00:04:38.053 "r_mbytes_per_sec": 0, 00:04:38.053 "w_mbytes_per_sec": 0 00:04:38.053 }, 00:04:38.053 "claimed": true, 00:04:38.053 "claim_type": "exclusive_write", 00:04:38.053 "zoned": false, 00:04:38.053 "supported_io_types": { 00:04:38.053 "read": true, 00:04:38.053 "write": true, 00:04:38.053 "unmap": true, 00:04:38.053 "write_zeroes": true, 00:04:38.053 "flush": true, 00:04:38.053 "reset": true, 00:04:38.053 "compare": false, 00:04:38.053 "compare_and_write": false, 00:04:38.053 "abort": true, 00:04:38.053 "nvme_admin": false, 00:04:38.053 "nvme_io": false 00:04:38.053 }, 00:04:38.053 "memory_domains": [ 00:04:38.053 { 00:04:38.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:38.053 "dma_device_type": 2 00:04:38.053 } 00:04:38.053 ], 00:04:38.053 "driver_specific": {} 00:04:38.053 }, 00:04:38.053 { 00:04:38.053 "name": "Passthru0", 00:04:38.053 "aliases": [ 00:04:38.053 "ffb07a95-cac4-5910-81a7-d7ab8c7d765e" 00:04:38.053 ], 00:04:38.053 "product_name": "passthru", 00:04:38.053 "block_size": 512, 00:04:38.053 "num_blocks": 16384, 00:04:38.053 "uuid": "ffb07a95-cac4-5910-81a7-d7ab8c7d765e", 00:04:38.053 "assigned_rate_limits": { 00:04:38.053 "rw_ios_per_sec": 0, 00:04:38.053 "rw_mbytes_per_sec": 0, 00:04:38.053 "r_mbytes_per_sec": 0, 00:04:38.053 
"w_mbytes_per_sec": 0 00:04:38.053 }, 00:04:38.053 "claimed": false, 00:04:38.053 "zoned": false, 00:04:38.053 "supported_io_types": { 00:04:38.053 "read": true, 00:04:38.053 "write": true, 00:04:38.053 "unmap": true, 00:04:38.053 "write_zeroes": true, 00:04:38.053 "flush": true, 00:04:38.053 "reset": true, 00:04:38.053 "compare": false, 00:04:38.053 "compare_and_write": false, 00:04:38.053 "abort": true, 00:04:38.053 "nvme_admin": false, 00:04:38.053 "nvme_io": false 00:04:38.053 }, 00:04:38.053 "memory_domains": [ 00:04:38.053 { 00:04:38.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:38.053 "dma_device_type": 2 00:04:38.053 } 00:04:38.053 ], 00:04:38.053 "driver_specific": { 00:04:38.053 "passthru": { 00:04:38.053 "name": "Passthru0", 00:04:38.053 "base_bdev_name": "Malloc0" 00:04:38.053 } 00:04:38.053 } 00:04:38.053 } 00:04:38.053 ]' 00:04:38.053 09:28:00 -- rpc/rpc.sh@21 -- # jq length 00:04:38.053 09:28:00 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:04:38.053 09:28:00 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:04:38.053 09:28:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.053 09:28:00 -- common/autotest_common.sh@10 -- # set +x 00:04:38.053 09:28:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.053 09:28:00 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:04:38.053 09:28:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.054 09:28:00 -- common/autotest_common.sh@10 -- # set +x 00:04:38.054 09:28:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.054 09:28:00 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:04:38.054 09:28:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.054 09:28:00 -- common/autotest_common.sh@10 -- # set +x 00:04:38.054 09:28:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.054 09:28:00 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:04:38.054 09:28:00 -- rpc/rpc.sh@26 -- # jq length 00:04:38.054 09:28:00 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:04:38.054 00:04:38.054 real 0m0.275s 00:04:38.054 user 0m0.170s 00:04:38.054 sys 0m0.040s 00:04:38.054 09:28:00 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:38.054 09:28:00 -- common/autotest_common.sh@10 -- # set +x 00:04:38.054 ************************************ 00:04:38.054 END TEST rpc_integrity 00:04:38.054 ************************************ 00:04:38.054 09:28:00 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:04:38.054 09:28:00 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:38.054 09:28:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:38.054 09:28:00 -- common/autotest_common.sh@10 -- # set +x 00:04:38.054 ************************************ 00:04:38.054 START TEST rpc_plugins 00:04:38.054 ************************************ 00:04:38.054 09:28:00 -- common/autotest_common.sh@1114 -- # rpc_plugins 00:04:38.054 09:28:00 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:04:38.054 09:28:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.054 09:28:00 -- common/autotest_common.sh@10 -- # set +x 00:04:38.314 09:28:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.314 09:28:00 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:04:38.314 09:28:00 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:04:38.314 09:28:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.314 09:28:00 -- common/autotest_common.sh@10 -- # set +x 00:04:38.314 09:28:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.314 09:28:00 -- 
rpc/rpc.sh@31 -- # bdevs='[ 00:04:38.314 { 00:04:38.314 "name": "Malloc1", 00:04:38.314 "aliases": [ 00:04:38.314 "c844c251-2553-4246-8f6b-750b4b16941b" 00:04:38.314 ], 00:04:38.314 "product_name": "Malloc disk", 00:04:38.314 "block_size": 4096, 00:04:38.314 "num_blocks": 256, 00:04:38.314 "uuid": "c844c251-2553-4246-8f6b-750b4b16941b", 00:04:38.314 "assigned_rate_limits": { 00:04:38.314 "rw_ios_per_sec": 0, 00:04:38.314 "rw_mbytes_per_sec": 0, 00:04:38.314 "r_mbytes_per_sec": 0, 00:04:38.314 "w_mbytes_per_sec": 0 00:04:38.314 }, 00:04:38.314 "claimed": false, 00:04:38.314 "zoned": false, 00:04:38.314 "supported_io_types": { 00:04:38.314 "read": true, 00:04:38.314 "write": true, 00:04:38.314 "unmap": true, 00:04:38.314 "write_zeroes": true, 00:04:38.314 "flush": true, 00:04:38.314 "reset": true, 00:04:38.314 "compare": false, 00:04:38.314 "compare_and_write": false, 00:04:38.314 "abort": true, 00:04:38.314 "nvme_admin": false, 00:04:38.314 "nvme_io": false 00:04:38.314 }, 00:04:38.314 "memory_domains": [ 00:04:38.314 { 00:04:38.314 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:38.314 "dma_device_type": 2 00:04:38.314 } 00:04:38.314 ], 00:04:38.314 "driver_specific": {} 00:04:38.314 } 00:04:38.314 ]' 00:04:38.314 09:28:00 -- rpc/rpc.sh@32 -- # jq length 00:04:38.314 09:28:00 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:04:38.314 09:28:00 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:04:38.314 09:28:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.314 09:28:00 -- common/autotest_common.sh@10 -- # set +x 00:04:38.314 09:28:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.314 09:28:00 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:04:38.314 09:28:00 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.314 09:28:00 -- common/autotest_common.sh@10 -- # set +x 00:04:38.314 09:28:00 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.314 09:28:00 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:04:38.314 09:28:00 -- rpc/rpc.sh@36 -- # jq length 00:04:38.314 09:28:01 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:04:38.314 00:04:38.314 real 0m0.140s 00:04:38.314 user 0m0.080s 00:04:38.314 sys 0m0.025s 00:04:38.314 09:28:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:38.314 09:28:01 -- common/autotest_common.sh@10 -- # set +x 00:04:38.314 ************************************ 00:04:38.314 END TEST rpc_plugins 00:04:38.314 ************************************ 00:04:38.314 09:28:01 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:04:38.314 09:28:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:38.314 09:28:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:38.314 09:28:01 -- common/autotest_common.sh@10 -- # set +x 00:04:38.314 ************************************ 00:04:38.314 START TEST rpc_trace_cmd_test 00:04:38.314 ************************************ 00:04:38.314 09:28:01 -- common/autotest_common.sh@1114 -- # rpc_trace_cmd_test 00:04:38.314 09:28:01 -- rpc/rpc.sh@40 -- # local info 00:04:38.314 09:28:01 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:04:38.314 09:28:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.314 09:28:01 -- common/autotest_common.sh@10 -- # set +x 00:04:38.314 09:28:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.314 09:28:01 -- rpc/rpc.sh@42 -- # info='{ 00:04:38.314 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid3155039", 00:04:38.314 "tpoint_group_mask": "0x8", 00:04:38.314 "iscsi_conn": { 00:04:38.314 "mask": "0x2", 
00:04:38.314 "tpoint_mask": "0x0" 00:04:38.314 }, 00:04:38.314 "scsi": { 00:04:38.314 "mask": "0x4", 00:04:38.314 "tpoint_mask": "0x0" 00:04:38.314 }, 00:04:38.314 "bdev": { 00:04:38.314 "mask": "0x8", 00:04:38.314 "tpoint_mask": "0xffffffffffffffff" 00:04:38.314 }, 00:04:38.314 "nvmf_rdma": { 00:04:38.314 "mask": "0x10", 00:04:38.314 "tpoint_mask": "0x0" 00:04:38.314 }, 00:04:38.314 "nvmf_tcp": { 00:04:38.314 "mask": "0x20", 00:04:38.314 "tpoint_mask": "0x0" 00:04:38.314 }, 00:04:38.314 "ftl": { 00:04:38.314 "mask": "0x40", 00:04:38.314 "tpoint_mask": "0x0" 00:04:38.314 }, 00:04:38.314 "blobfs": { 00:04:38.314 "mask": "0x80", 00:04:38.314 "tpoint_mask": "0x0" 00:04:38.314 }, 00:04:38.314 "dsa": { 00:04:38.314 "mask": "0x200", 00:04:38.314 "tpoint_mask": "0x0" 00:04:38.314 }, 00:04:38.314 "thread": { 00:04:38.314 "mask": "0x400", 00:04:38.314 "tpoint_mask": "0x0" 00:04:38.314 }, 00:04:38.314 "nvme_pcie": { 00:04:38.314 "mask": "0x800", 00:04:38.314 "tpoint_mask": "0x0" 00:04:38.314 }, 00:04:38.314 "iaa": { 00:04:38.314 "mask": "0x1000", 00:04:38.314 "tpoint_mask": "0x0" 00:04:38.314 }, 00:04:38.314 "nvme_tcp": { 00:04:38.314 "mask": "0x2000", 00:04:38.314 "tpoint_mask": "0x0" 00:04:38.314 }, 00:04:38.314 "bdev_nvme": { 00:04:38.314 "mask": "0x4000", 00:04:38.314 "tpoint_mask": "0x0" 00:04:38.314 } 00:04:38.314 }' 00:04:38.314 09:28:01 -- rpc/rpc.sh@43 -- # jq length 00:04:38.314 09:28:01 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:04:38.314 09:28:01 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:04:38.574 09:28:01 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:04:38.574 09:28:01 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:04:38.574 09:28:01 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:04:38.574 09:28:01 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:04:38.574 09:28:01 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:04:38.574 09:28:01 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:04:38.574 09:28:01 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:04:38.574 00:04:38.574 real 0m0.208s 00:04:38.574 user 0m0.163s 00:04:38.574 sys 0m0.037s 00:04:38.574 09:28:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:38.574 09:28:01 -- common/autotest_common.sh@10 -- # set +x 00:04:38.574 ************************************ 00:04:38.574 END TEST rpc_trace_cmd_test 00:04:38.574 ************************************ 00:04:38.574 09:28:01 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:04:38.574 09:28:01 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:04:38.574 09:28:01 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:04:38.574 09:28:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:38.574 09:28:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:38.574 09:28:01 -- common/autotest_common.sh@10 -- # set +x 00:04:38.574 ************************************ 00:04:38.574 START TEST rpc_daemon_integrity 00:04:38.574 ************************************ 00:04:38.574 09:28:01 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:04:38.574 09:28:01 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:04:38.574 09:28:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.574 09:28:01 -- common/autotest_common.sh@10 -- # set +x 00:04:38.574 09:28:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.574 09:28:01 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:04:38.574 09:28:01 -- rpc/rpc.sh@13 -- # jq length 00:04:38.574 09:28:01 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:04:38.574 09:28:01 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:04:38.574 
09:28:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.574 09:28:01 -- common/autotest_common.sh@10 -- # set +x 00:04:38.574 09:28:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.574 09:28:01 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:04:38.574 09:28:01 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:04:38.574 09:28:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.574 09:28:01 -- common/autotest_common.sh@10 -- # set +x 00:04:38.574 09:28:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.574 09:28:01 -- rpc/rpc.sh@16 -- # bdevs='[ 00:04:38.574 { 00:04:38.574 "name": "Malloc2", 00:04:38.574 "aliases": [ 00:04:38.574 "b3bc65f0-af29-4816-af67-ce7daf44ba42" 00:04:38.574 ], 00:04:38.574 "product_name": "Malloc disk", 00:04:38.574 "block_size": 512, 00:04:38.574 "num_blocks": 16384, 00:04:38.574 "uuid": "b3bc65f0-af29-4816-af67-ce7daf44ba42", 00:04:38.574 "assigned_rate_limits": { 00:04:38.574 "rw_ios_per_sec": 0, 00:04:38.574 "rw_mbytes_per_sec": 0, 00:04:38.574 "r_mbytes_per_sec": 0, 00:04:38.574 "w_mbytes_per_sec": 0 00:04:38.574 }, 00:04:38.574 "claimed": false, 00:04:38.574 "zoned": false, 00:04:38.574 "supported_io_types": { 00:04:38.574 "read": true, 00:04:38.574 "write": true, 00:04:38.574 "unmap": true, 00:04:38.574 "write_zeroes": true, 00:04:38.574 "flush": true, 00:04:38.574 "reset": true, 00:04:38.574 "compare": false, 00:04:38.574 "compare_and_write": false, 00:04:38.574 "abort": true, 00:04:38.574 "nvme_admin": false, 00:04:38.574 "nvme_io": false 00:04:38.574 }, 00:04:38.574 "memory_domains": [ 00:04:38.574 { 00:04:38.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:38.574 "dma_device_type": 2 00:04:38.574 } 00:04:38.574 ], 00:04:38.574 "driver_specific": {} 00:04:38.574 } 00:04:38.574 ]' 00:04:38.574 09:28:01 -- rpc/rpc.sh@17 -- # jq length 00:04:38.834 09:28:01 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:04:38.834 09:28:01 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:04:38.834 09:28:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.834 09:28:01 -- common/autotest_common.sh@10 -- # set +x 00:04:38.834 [2024-11-29 09:28:01.455345] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:04:38.834 [2024-11-29 09:28:01.455377] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:04:38.834 [2024-11-29 09:28:01.455396] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x598f980 00:04:38.834 [2024-11-29 09:28:01.455406] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:04:38.834 [2024-11-29 09:28:01.456102] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:04:38.834 [2024-11-29 09:28:01.456121] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:04:38.834 Passthru0 00:04:38.834 09:28:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.834 09:28:01 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:04:38.834 09:28:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.834 09:28:01 -- common/autotest_common.sh@10 -- # set +x 00:04:38.834 09:28:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.834 09:28:01 -- rpc/rpc.sh@20 -- # bdevs='[ 00:04:38.834 { 00:04:38.834 "name": "Malloc2", 00:04:38.834 "aliases": [ 00:04:38.834 "b3bc65f0-af29-4816-af67-ce7daf44ba42" 00:04:38.834 ], 00:04:38.834 "product_name": "Malloc disk", 00:04:38.834 "block_size": 512, 00:04:38.834 "num_blocks": 16384, 
00:04:38.834 "uuid": "b3bc65f0-af29-4816-af67-ce7daf44ba42", 00:04:38.834 "assigned_rate_limits": { 00:04:38.834 "rw_ios_per_sec": 0, 00:04:38.834 "rw_mbytes_per_sec": 0, 00:04:38.834 "r_mbytes_per_sec": 0, 00:04:38.834 "w_mbytes_per_sec": 0 00:04:38.834 }, 00:04:38.834 "claimed": true, 00:04:38.834 "claim_type": "exclusive_write", 00:04:38.834 "zoned": false, 00:04:38.834 "supported_io_types": { 00:04:38.834 "read": true, 00:04:38.834 "write": true, 00:04:38.834 "unmap": true, 00:04:38.834 "write_zeroes": true, 00:04:38.834 "flush": true, 00:04:38.834 "reset": true, 00:04:38.834 "compare": false, 00:04:38.834 "compare_and_write": false, 00:04:38.834 "abort": true, 00:04:38.834 "nvme_admin": false, 00:04:38.834 "nvme_io": false 00:04:38.834 }, 00:04:38.834 "memory_domains": [ 00:04:38.834 { 00:04:38.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:38.834 "dma_device_type": 2 00:04:38.834 } 00:04:38.834 ], 00:04:38.834 "driver_specific": {} 00:04:38.834 }, 00:04:38.834 { 00:04:38.834 "name": "Passthru0", 00:04:38.834 "aliases": [ 00:04:38.834 "1ef1906c-3ad6-52fa-af62-cd1e20e2704d" 00:04:38.834 ], 00:04:38.834 "product_name": "passthru", 00:04:38.834 "block_size": 512, 00:04:38.834 "num_blocks": 16384, 00:04:38.834 "uuid": "1ef1906c-3ad6-52fa-af62-cd1e20e2704d", 00:04:38.834 "assigned_rate_limits": { 00:04:38.834 "rw_ios_per_sec": 0, 00:04:38.834 "rw_mbytes_per_sec": 0, 00:04:38.834 "r_mbytes_per_sec": 0, 00:04:38.834 "w_mbytes_per_sec": 0 00:04:38.834 }, 00:04:38.834 "claimed": false, 00:04:38.834 "zoned": false, 00:04:38.834 "supported_io_types": { 00:04:38.834 "read": true, 00:04:38.834 "write": true, 00:04:38.834 "unmap": true, 00:04:38.834 "write_zeroes": true, 00:04:38.834 "flush": true, 00:04:38.834 "reset": true, 00:04:38.834 "compare": false, 00:04:38.834 "compare_and_write": false, 00:04:38.834 "abort": true, 00:04:38.834 "nvme_admin": false, 00:04:38.834 "nvme_io": false 00:04:38.834 }, 00:04:38.834 "memory_domains": [ 00:04:38.834 { 00:04:38.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:04:38.834 "dma_device_type": 2 00:04:38.834 } 00:04:38.834 ], 00:04:38.834 "driver_specific": { 00:04:38.834 "passthru": { 00:04:38.834 "name": "Passthru0", 00:04:38.834 "base_bdev_name": "Malloc2" 00:04:38.834 } 00:04:38.834 } 00:04:38.834 } 00:04:38.834 ]' 00:04:38.834 09:28:01 -- rpc/rpc.sh@21 -- # jq length 00:04:38.834 09:28:01 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:04:38.834 09:28:01 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:04:38.834 09:28:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.834 09:28:01 -- common/autotest_common.sh@10 -- # set +x 00:04:38.834 09:28:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.834 09:28:01 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:04:38.834 09:28:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.834 09:28:01 -- common/autotest_common.sh@10 -- # set +x 00:04:38.834 09:28:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.834 09:28:01 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:04:38.834 09:28:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:38.834 09:28:01 -- common/autotest_common.sh@10 -- # set +x 00:04:38.834 09:28:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:38.834 09:28:01 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:04:38.834 09:28:01 -- rpc/rpc.sh@26 -- # jq length 00:04:38.834 09:28:01 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:04:38.834 00:04:38.834 real 0m0.283s 00:04:38.834 user 0m0.172s 00:04:38.834 sys 0m0.046s 00:04:38.834 
09:28:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:38.834 09:28:01 -- common/autotest_common.sh@10 -- # set +x 00:04:38.834 ************************************ 00:04:38.834 END TEST rpc_daemon_integrity 00:04:38.834 ************************************ 00:04:38.834 09:28:01 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:04:38.834 09:28:01 -- rpc/rpc.sh@84 -- # killprocess 3155039 00:04:38.834 09:28:01 -- common/autotest_common.sh@936 -- # '[' -z 3155039 ']' 00:04:38.834 09:28:01 -- common/autotest_common.sh@940 -- # kill -0 3155039 00:04:38.834 09:28:01 -- common/autotest_common.sh@941 -- # uname 00:04:38.834 09:28:01 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:04:38.834 09:28:01 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3155039 00:04:39.094 09:28:01 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:04:39.094 09:28:01 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:04:39.094 09:28:01 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3155039' 00:04:39.094 killing process with pid 3155039 00:04:39.094 09:28:01 -- common/autotest_common.sh@955 -- # kill 3155039 00:04:39.094 09:28:01 -- common/autotest_common.sh@960 -- # wait 3155039 00:04:39.353 00:04:39.353 real 0m2.504s 00:04:39.353 user 0m3.096s 00:04:39.353 sys 0m0.762s 00:04:39.353 09:28:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:39.353 09:28:02 -- common/autotest_common.sh@10 -- # set +x 00:04:39.353 ************************************ 00:04:39.353 END TEST rpc 00:04:39.353 ************************************ 00:04:39.353 09:28:02 -- spdk/autotest.sh@164 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:04:39.353 09:28:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:39.353 09:28:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:39.353 09:28:02 -- common/autotest_common.sh@10 -- # set +x 00:04:39.353 ************************************ 00:04:39.353 START TEST rpc_client 00:04:39.353 ************************************ 00:04:39.353 09:28:02 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:04:39.353 * Looking for test storage... 
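The shutdown traced just above follows the harness's killprocess helper. A condensed reconstruction of only the steps visible in this log (the real helper also branches on the OS via uname):

  killprocess() {
      local pid=$1 name
      [ -n "$pid" ] || return 1
      kill -0 "$pid"                           # fail early if already gone
      name=$(ps --no-headers -o comm= "$pid")
      [ "$name" = sudo ] && return 0           # a sudo wrapper needs different handling
      echo "killing process with pid $pid"
      kill "$pid"
      wait "$pid"
  }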
00:04:39.353 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:04:39.353 09:28:02 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:39.353 09:28:02 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:39.353 09:28:02 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:39.613 09:28:02 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:39.613 09:28:02 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:39.613 09:28:02 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:39.613 09:28:02 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:39.613 09:28:02 -- scripts/common.sh@335 -- # IFS=.-: 00:04:39.613 09:28:02 -- scripts/common.sh@335 -- # read -ra ver1 00:04:39.613 09:28:02 -- scripts/common.sh@336 -- # IFS=.-: 00:04:39.613 09:28:02 -- scripts/common.sh@336 -- # read -ra ver2 00:04:39.613 09:28:02 -- scripts/common.sh@337 -- # local 'op=<' 00:04:39.613 09:28:02 -- scripts/common.sh@339 -- # ver1_l=2 00:04:39.613 09:28:02 -- scripts/common.sh@340 -- # ver2_l=1 00:04:39.613 09:28:02 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:39.613 09:28:02 -- scripts/common.sh@343 -- # case "$op" in 00:04:39.613 09:28:02 -- scripts/common.sh@344 -- # : 1 00:04:39.613 09:28:02 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:39.613 09:28:02 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:39.613 09:28:02 -- scripts/common.sh@364 -- # decimal 1 00:04:39.613 09:28:02 -- scripts/common.sh@352 -- # local d=1 00:04:39.613 09:28:02 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:39.613 09:28:02 -- scripts/common.sh@354 -- # echo 1 00:04:39.613 09:28:02 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:39.613 09:28:02 -- scripts/common.sh@365 -- # decimal 2 00:04:39.613 09:28:02 -- scripts/common.sh@352 -- # local d=2 00:04:39.613 09:28:02 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:39.613 09:28:02 -- scripts/common.sh@354 -- # echo 2 00:04:39.613 09:28:02 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:39.613 09:28:02 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:39.613 09:28:02 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:39.613 09:28:02 -- scripts/common.sh@367 -- # return 0 00:04:39.613 09:28:02 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:39.613 09:28:02 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:39.613 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.613 --rc genhtml_branch_coverage=1 00:04:39.613 --rc genhtml_function_coverage=1 00:04:39.613 --rc genhtml_legend=1 00:04:39.613 --rc geninfo_all_blocks=1 00:04:39.613 --rc geninfo_unexecuted_blocks=1 00:04:39.613 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.613 ' 00:04:39.613 09:28:02 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:39.613 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.613 --rc genhtml_branch_coverage=1 00:04:39.613 --rc genhtml_function_coverage=1 00:04:39.613 --rc genhtml_legend=1 00:04:39.613 --rc geninfo_all_blocks=1 00:04:39.613 --rc geninfo_unexecuted_blocks=1 00:04:39.613 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.613 ' 00:04:39.613 09:28:02 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:39.613 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.613 --rc genhtml_branch_coverage=1 
00:04:39.613 --rc genhtml_function_coverage=1 00:04:39.613 --rc genhtml_legend=1 00:04:39.613 --rc geninfo_all_blocks=1 00:04:39.613 --rc geninfo_unexecuted_blocks=1 00:04:39.613 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.613 ' 00:04:39.613 09:28:02 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:39.613 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.613 --rc genhtml_branch_coverage=1 00:04:39.613 --rc genhtml_function_coverage=1 00:04:39.613 --rc genhtml_legend=1 00:04:39.613 --rc geninfo_all_blocks=1 00:04:39.613 --rc geninfo_unexecuted_blocks=1 00:04:39.613 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.613 ' 00:04:39.613 09:28:02 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:04:39.613 OK 00:04:39.613 09:28:02 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:04:39.613 00:04:39.613 real 0m0.168s 00:04:39.613 user 0m0.091s 00:04:39.613 sys 0m0.089s 00:04:39.613 09:28:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:39.613 09:28:02 -- common/autotest_common.sh@10 -- # set +x 00:04:39.613 ************************************ 00:04:39.613 END TEST rpc_client 00:04:39.613 ************************************ 00:04:39.613 09:28:02 -- spdk/autotest.sh@165 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:04:39.613 09:28:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:39.613 09:28:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:39.613 09:28:02 -- common/autotest_common.sh@10 -- # set +x 00:04:39.613 ************************************ 00:04:39.613 START TEST json_config 00:04:39.613 ************************************ 00:04:39.614 09:28:02 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:04:39.614 09:28:02 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:39.614 09:28:02 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:39.614 09:28:02 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:39.614 09:28:02 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:39.614 09:28:02 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:39.614 09:28:02 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:39.614 09:28:02 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:39.614 09:28:02 -- scripts/common.sh@335 -- # IFS=.-: 00:04:39.614 09:28:02 -- scripts/common.sh@335 -- # read -ra ver1 00:04:39.614 09:28:02 -- scripts/common.sh@336 -- # IFS=.-: 00:04:39.614 09:28:02 -- scripts/common.sh@336 -- # read -ra ver2 00:04:39.614 09:28:02 -- scripts/common.sh@337 -- # local 'op=<' 00:04:39.614 09:28:02 -- scripts/common.sh@339 -- # ver1_l=2 00:04:39.614 09:28:02 -- scripts/common.sh@340 -- # ver2_l=1 00:04:39.614 09:28:02 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:39.614 09:28:02 -- scripts/common.sh@343 -- # case "$op" in 00:04:39.614 09:28:02 -- scripts/common.sh@344 -- # : 1 00:04:39.614 09:28:02 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:39.614 09:28:02 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:39.614 09:28:02 -- scripts/common.sh@364 -- # decimal 1 00:04:39.614 09:28:02 -- scripts/common.sh@352 -- # local d=1 00:04:39.614 09:28:02 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:39.614 09:28:02 -- scripts/common.sh@354 -- # echo 1 00:04:39.614 09:28:02 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:39.614 09:28:02 -- scripts/common.sh@365 -- # decimal 2 00:04:39.614 09:28:02 -- scripts/common.sh@352 -- # local d=2 00:04:39.614 09:28:02 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:39.614 09:28:02 -- scripts/common.sh@354 -- # echo 2 00:04:39.614 09:28:02 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:39.614 09:28:02 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:39.614 09:28:02 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:39.614 09:28:02 -- scripts/common.sh@367 -- # return 0 00:04:39.614 09:28:02 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:39.614 09:28:02 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:39.614 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.614 --rc genhtml_branch_coverage=1 00:04:39.614 --rc genhtml_function_coverage=1 00:04:39.614 --rc genhtml_legend=1 00:04:39.614 --rc geninfo_all_blocks=1 00:04:39.614 --rc geninfo_unexecuted_blocks=1 00:04:39.614 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.614 ' 00:04:39.614 09:28:02 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:39.614 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.614 --rc genhtml_branch_coverage=1 00:04:39.614 --rc genhtml_function_coverage=1 00:04:39.614 --rc genhtml_legend=1 00:04:39.614 --rc geninfo_all_blocks=1 00:04:39.614 --rc geninfo_unexecuted_blocks=1 00:04:39.614 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.614 ' 00:04:39.614 09:28:02 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:39.614 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.614 --rc genhtml_branch_coverage=1 00:04:39.614 --rc genhtml_function_coverage=1 00:04:39.614 --rc genhtml_legend=1 00:04:39.614 --rc geninfo_all_blocks=1 00:04:39.614 --rc geninfo_unexecuted_blocks=1 00:04:39.614 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.614 ' 00:04:39.614 09:28:02 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:39.614 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.614 --rc genhtml_branch_coverage=1 00:04:39.614 --rc genhtml_function_coverage=1 00:04:39.614 --rc genhtml_legend=1 00:04:39.614 --rc geninfo_all_blocks=1 00:04:39.614 --rc geninfo_unexecuted_blocks=1 00:04:39.614 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.614 ' 00:04:39.614 09:28:02 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:04:39.614 09:28:02 -- nvmf/common.sh@7 -- # uname -s 00:04:39.874 09:28:02 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:39.874 09:28:02 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:39.874 09:28:02 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:39.874 09:28:02 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:39.874 09:28:02 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:39.874 09:28:02 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:39.874 09:28:02 -- 
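The lt/cmp_versions trace that precedes each test compares dotted version strings field by field (here, lcov 1.15 against 2). An equivalent strict less-than check can be written with GNU sort's version ordering, assuming sort -V agrees with the helper's numeric comparison for inputs like these:

  lt() {
      [ "$1" != "$2" ] &&
      [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n1)" = "$1" ]
  }
  lt 1.15 2 && echo "lcov 1.15 is older than 2"   # prints: lcov 1.15 is older than 2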
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:39.874 09:28:02 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:39.874 09:28:02 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:39.874 09:28:02 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:39.874 09:28:02 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:04:39.874 09:28:02 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:04:39.874 09:28:02 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:39.874 09:28:02 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:39.874 09:28:02 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:39.874 09:28:02 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:04:39.874 09:28:02 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:39.874 09:28:02 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:39.874 09:28:02 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:39.874 09:28:02 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:39.874 09:28:02 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:39.874 09:28:02 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:39.874 09:28:02 -- paths/export.sh@5 -- # export PATH 00:04:39.874 09:28:02 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:39.874 09:28:02 -- nvmf/common.sh@46 -- # : 0 00:04:39.874 09:28:02 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:04:39.874 09:28:02 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:04:39.874 09:28:02 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:04:39.874 09:28:02 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:39.874 09:28:02 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:39.874 09:28:02 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:04:39.874 09:28:02 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:04:39.874 
09:28:02 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:04:39.874 09:28:02 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:04:39.874 09:28:02 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:04:39.874 09:28:02 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:04:39.874 09:28:02 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:04:39.874 09:28:02 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:04:39.874 WARNING: No tests are enabled so not running JSON configuration tests 00:04:39.874 09:28:02 -- json_config/json_config.sh@27 -- # exit 0 00:04:39.874 00:04:39.874 real 0m0.190s 00:04:39.874 user 0m0.096s 00:04:39.874 sys 0m0.102s 00:04:39.875 09:28:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:39.875 09:28:02 -- common/autotest_common.sh@10 -- # set +x 00:04:39.875 ************************************ 00:04:39.875 END TEST json_config 00:04:39.875 ************************************ 00:04:39.875 09:28:02 -- spdk/autotest.sh@166 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:04:39.875 09:28:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:39.875 09:28:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:39.875 09:28:02 -- common/autotest_common.sh@10 -- # set +x 00:04:39.875 ************************************ 00:04:39.875 START TEST json_config_extra_key 00:04:39.875 ************************************ 00:04:39.875 09:28:02 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:04:39.875 09:28:02 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:39.875 09:28:02 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:39.875 09:28:02 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:39.875 09:28:02 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:39.875 09:28:02 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:39.875 09:28:02 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:39.875 09:28:02 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:39.875 09:28:02 -- scripts/common.sh@335 -- # IFS=.-: 00:04:39.875 09:28:02 -- scripts/common.sh@335 -- # read -ra ver1 00:04:39.875 09:28:02 -- scripts/common.sh@336 -- # IFS=.-: 00:04:39.875 09:28:02 -- scripts/common.sh@336 -- # read -ra ver2 00:04:39.875 09:28:02 -- scripts/common.sh@337 -- # local 'op=<' 00:04:39.875 09:28:02 -- scripts/common.sh@339 -- # ver1_l=2 00:04:39.875 09:28:02 -- scripts/common.sh@340 -- # ver2_l=1 00:04:39.875 09:28:02 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:39.875 09:28:02 -- scripts/common.sh@343 -- # case "$op" in 00:04:39.875 09:28:02 -- scripts/common.sh@344 -- # : 1 00:04:39.875 09:28:02 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:39.875 09:28:02 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:39.875 09:28:02 -- scripts/common.sh@364 -- # decimal 1 00:04:39.875 09:28:02 -- scripts/common.sh@352 -- # local d=1 00:04:39.875 09:28:02 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:39.875 09:28:02 -- scripts/common.sh@354 -- # echo 1 00:04:39.875 09:28:02 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:39.875 09:28:02 -- scripts/common.sh@365 -- # decimal 2 00:04:39.875 09:28:02 -- scripts/common.sh@352 -- # local d=2 00:04:39.875 09:28:02 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:39.875 09:28:02 -- scripts/common.sh@354 -- # echo 2 00:04:39.875 09:28:02 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:39.875 09:28:02 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:39.875 09:28:02 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:39.875 09:28:02 -- scripts/common.sh@367 -- # return 0 00:04:39.875 09:28:02 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:39.875 09:28:02 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:39.875 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.875 --rc genhtml_branch_coverage=1 00:04:39.875 --rc genhtml_function_coverage=1 00:04:39.875 --rc genhtml_legend=1 00:04:39.875 --rc geninfo_all_blocks=1 00:04:39.875 --rc geninfo_unexecuted_blocks=1 00:04:39.875 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.875 ' 00:04:39.875 09:28:02 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:39.875 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.875 --rc genhtml_branch_coverage=1 00:04:39.875 --rc genhtml_function_coverage=1 00:04:39.875 --rc genhtml_legend=1 00:04:39.875 --rc geninfo_all_blocks=1 00:04:39.875 --rc geninfo_unexecuted_blocks=1 00:04:39.875 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.875 ' 00:04:39.875 09:28:02 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:39.875 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.875 --rc genhtml_branch_coverage=1 00:04:39.875 --rc genhtml_function_coverage=1 00:04:39.875 --rc genhtml_legend=1 00:04:39.875 --rc geninfo_all_blocks=1 00:04:39.875 --rc geninfo_unexecuted_blocks=1 00:04:39.875 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.875 ' 00:04:39.875 09:28:02 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:39.875 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:39.875 --rc genhtml_branch_coverage=1 00:04:39.875 --rc genhtml_function_coverage=1 00:04:39.875 --rc genhtml_legend=1 00:04:39.875 --rc geninfo_all_blocks=1 00:04:39.875 --rc geninfo_unexecuted_blocks=1 00:04:39.875 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:39.875 ' 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:04:39.875 09:28:02 -- nvmf/common.sh@7 -- # uname -s 00:04:39.875 09:28:02 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:39.875 09:28:02 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:39.875 09:28:02 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:39.875 09:28:02 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:39.875 09:28:02 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:39.875 09:28:02 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:39.875 09:28:02 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:39.875
09:28:02 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:04:39.875 INFO: launching applications... 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@25 -- # shift 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=3155835 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:04:39.875 Waiting for target to run... 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:04:39.875 09:28:02 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 3155835 /var/tmp/spdk_tgt.sock 00:04:39.875 09:28:02 -- common/autotest_common.sh@829 -- # '[' -z 3155835 ']' 00:04:39.875 09:28:02 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:04:39.875 09:28:02 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:39.876 09:28:02 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:04:39.876 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:04:39.876 09:28:02 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:39.876 09:28:02 -- common/autotest_common.sh@10 -- # set +x 00:04:39.876 [2024-11-29 09:28:02.714945] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
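The records above show the extra-key test launching spdk_tgt with -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json extra_key.json, saving the pid, and then waiting for the RPC socket to come up. A minimal sketch of that launch-and-wait pattern, using spdk_get_version as the readiness probe (the real waitforlisten helper is more elaborate, so this is an illustrative assumption):

    app_sock=/var/tmp/spdk_tgt.sock
    build/bin/spdk_tgt -m 0x1 -s 1024 -r "$app_sock" \
        --json test/json_config/extra_key.json &
    app_pid=$!
    # poll until the target answers on its UNIX-domain RPC socket
    for ((i = 0; i < 100; i++)); do
        if scripts/rpc.py -s "$app_sock" spdk_get_version >/dev/null 2>&1; then
            break
        fi
        sleep 0.1
    done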
00:04:39.876 [2024-11-29 09:28:02.715012] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3155835 ] 00:04:40.135 EAL: No free 2048 kB hugepages reported on node 1 00:04:40.394 [2024-11-29 09:28:02.988153] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:40.394 [2024-11-29 09:28:03.051919] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:40.394 [2024-11-29 09:28:03.052027] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:40.962 09:28:03 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:40.962 09:28:03 -- common/autotest_common.sh@862 -- # return 0 00:04:40.962 09:28:03 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:04:40.962 00:04:40.962 09:28:03 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:04:40.962 INFO: shutting down applications... 00:04:40.962 09:28:03 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:04:40.962 09:28:03 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:04:40.962 09:28:03 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:04:40.962 09:28:03 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 3155835 ]] 00:04:40.962 09:28:03 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 3155835 00:04:40.962 09:28:03 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:04:40.962 09:28:03 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:04:40.962 09:28:03 -- json_config/json_config_extra_key.sh@50 -- # kill -0 3155835 00:04:40.962 09:28:03 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:04:41.221 09:28:04 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:04:41.221 09:28:04 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:04:41.221 09:28:04 -- json_config/json_config_extra_key.sh@50 -- # kill -0 3155835 00:04:41.221 09:28:04 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:04:41.221 09:28:04 -- json_config/json_config_extra_key.sh@52 -- # break 00:04:41.221 09:28:04 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:04:41.221 09:28:04 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:04:41.221 SPDK target shutdown done 00:04:41.221 09:28:04 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:04:41.221 Success 00:04:41.221 00:04:41.221 real 0m1.518s 00:04:41.221 user 0m1.275s 00:04:41.221 sys 0m0.388s 00:04:41.221 09:28:04 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:41.221 09:28:04 -- common/autotest_common.sh@10 -- # set +x 00:04:41.221 ************************************ 00:04:41.221 END TEST json_config_extra_key 00:04:41.221 ************************************ 00:04:41.480 09:28:04 -- spdk/autotest.sh@167 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:04:41.480 09:28:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:41.480 09:28:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:41.480 09:28:04 -- common/autotest_common.sh@10 -- # set +x 00:04:41.480 ************************************ 00:04:41.480 START TEST alias_rpc 00:04:41.480 ************************************ 00:04:41.480 09:28:04 -- 
common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:04:41.480 * Looking for test storage... 00:04:41.480 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:04:41.480 09:28:04 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:41.480 09:28:04 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:41.480 09:28:04 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:41.480 09:28:04 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:41.480 09:28:04 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:41.480 09:28:04 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:41.480 09:28:04 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:41.480 09:28:04 -- scripts/common.sh@335 -- # IFS=.-: 00:04:41.480 09:28:04 -- scripts/common.sh@335 -- # read -ra ver1 00:04:41.480 09:28:04 -- scripts/common.sh@336 -- # IFS=.-: 00:04:41.480 09:28:04 -- scripts/common.sh@336 -- # read -ra ver2 00:04:41.480 09:28:04 -- scripts/common.sh@337 -- # local 'op=<' 00:04:41.480 09:28:04 -- scripts/common.sh@339 -- # ver1_l=2 00:04:41.480 09:28:04 -- scripts/common.sh@340 -- # ver2_l=1 00:04:41.480 09:28:04 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:41.480 09:28:04 -- scripts/common.sh@343 -- # case "$op" in 00:04:41.480 09:28:04 -- scripts/common.sh@344 -- # : 1 00:04:41.480 09:28:04 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:41.480 09:28:04 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:41.480 09:28:04 -- scripts/common.sh@364 -- # decimal 1 00:04:41.480 09:28:04 -- scripts/common.sh@352 -- # local d=1 00:04:41.480 09:28:04 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:41.480 09:28:04 -- scripts/common.sh@354 -- # echo 1 00:04:41.480 09:28:04 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:41.480 09:28:04 -- scripts/common.sh@365 -- # decimal 2 00:04:41.480 09:28:04 -- scripts/common.sh@352 -- # local d=2 00:04:41.480 09:28:04 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:41.480 09:28:04 -- scripts/common.sh@354 -- # echo 2 00:04:41.480 09:28:04 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:41.480 09:28:04 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:41.480 09:28:04 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:41.480 09:28:04 -- scripts/common.sh@367 -- # return 0 00:04:41.480 09:28:04 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:41.480 09:28:04 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:41.480 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:41.480 --rc genhtml_branch_coverage=1 00:04:41.480 --rc genhtml_function_coverage=1 00:04:41.480 --rc genhtml_legend=1 00:04:41.480 --rc geninfo_all_blocks=1 00:04:41.480 --rc geninfo_unexecuted_blocks=1 00:04:41.480 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:41.480 ' 00:04:41.480 09:28:04 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:41.480 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:41.480 --rc genhtml_branch_coverage=1 00:04:41.480 --rc genhtml_function_coverage=1 00:04:41.480 --rc genhtml_legend=1 00:04:41.480 --rc geninfo_all_blocks=1 00:04:41.480 --rc geninfo_unexecuted_blocks=1 00:04:41.480 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:41.480 ' 00:04:41.480 
09:28:04 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:41.480 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:41.480 --rc genhtml_branch_coverage=1 00:04:41.480 --rc genhtml_function_coverage=1 00:04:41.480 --rc genhtml_legend=1 00:04:41.480 --rc geninfo_all_blocks=1 00:04:41.480 --rc geninfo_unexecuted_blocks=1 00:04:41.480 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:41.480 ' 00:04:41.480 09:28:04 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:41.480 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:41.480 --rc genhtml_branch_coverage=1 00:04:41.481 --rc genhtml_function_coverage=1 00:04:41.481 --rc genhtml_legend=1 00:04:41.481 --rc geninfo_all_blocks=1 00:04:41.481 --rc geninfo_unexecuted_blocks=1 00:04:41.481 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:41.481 ' 00:04:41.481 09:28:04 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:04:41.481 09:28:04 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=3156160 00:04:41.481 09:28:04 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 3156160 00:04:41.481 09:28:04 -- common/autotest_common.sh@829 -- # '[' -z 3156160 ']' 00:04:41.481 09:28:04 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:41.481 09:28:04 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:41.481 09:28:04 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:41.481 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:41.481 09:28:04 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:41.481 09:28:04 -- common/autotest_common.sh@10 -- # set +x 00:04:41.481 09:28:04 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:41.481 [2024-11-29 09:28:04.291198] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
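alias_rpc.sh installs the ERR trap shown above so a failed step cannot leave the target running. A small sketch of that trap idiom, with a plain kill/wait standing in for the fuller killprocess helper:

    spdk_tgt_pid=$!
    killprocess() {
        # terminate the target and reap it so no orphan survives the test
        kill "$1" 2>/dev/null
        wait "$1" 2>/dev/null
    }
    trap 'killprocess "$spdk_tgt_pid"; exit 1' ERR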
00:04:41.481 [2024-11-29 09:28:04.291285] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3156160 ] 00:04:41.739 EAL: No free 2048 kB hugepages reported on node 1 00:04:41.739 [2024-11-29 09:28:04.360479] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:41.739 [2024-11-29 09:28:04.430822] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:41.739 [2024-11-29 09:28:04.430949] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:42.306 09:28:05 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:42.306 09:28:05 -- common/autotest_common.sh@862 -- # return 0 00:04:42.306 09:28:05 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:04:42.563 09:28:05 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 3156160 00:04:42.563 09:28:05 -- common/autotest_common.sh@936 -- # '[' -z 3156160 ']' 00:04:42.563 09:28:05 -- common/autotest_common.sh@940 -- # kill -0 3156160 00:04:42.563 09:28:05 -- common/autotest_common.sh@941 -- # uname 00:04:42.563 09:28:05 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:04:42.563 09:28:05 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3156160 00:04:42.563 09:28:05 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:04:42.563 09:28:05 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:04:42.563 09:28:05 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3156160' 00:04:42.563 killing process with pid 3156160 00:04:42.563 09:28:05 -- common/autotest_common.sh@955 -- # kill 3156160 00:04:42.563 09:28:05 -- common/autotest_common.sh@960 -- # wait 3156160 00:04:43.130 00:04:43.130 real 0m1.609s 00:04:43.130 user 0m1.740s 00:04:43.130 sys 0m0.457s 00:04:43.130 09:28:05 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:43.130 09:28:05 -- common/autotest_common.sh@10 -- # set +x 00:04:43.130 ************************************ 00:04:43.130 END TEST alias_rpc 00:04:43.130 ************************************ 00:04:43.130 09:28:05 -- spdk/autotest.sh@169 -- # [[ 0 -eq 0 ]] 00:04:43.130 09:28:05 -- spdk/autotest.sh@170 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:04:43.130 09:28:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:43.130 09:28:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:43.130 09:28:05 -- common/autotest_common.sh@10 -- # set +x 00:04:43.130 ************************************ 00:04:43.130 START TEST spdkcli_tcp 00:04:43.130 ************************************ 00:04:43.130 09:28:05 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:04:43.130 * Looking for test storage... 
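The load_config -i call above is the point of this test: a JSON configuration is piped to the running target over stdin, with -i read here as the switch that also registers deprecated RPC aliases, which is what alias_rpc verifies. A sketch against the default socket, using a placeholder payload:

    echo '{"subsystems": []}' | scripts/rpc.py load_config -i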
00:04:43.130 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:04:43.130 09:28:05 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:43.130 09:28:05 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:43.130 09:28:05 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:43.130 09:28:05 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:43.130 09:28:05 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:43.130 09:28:05 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:43.130 09:28:05 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:43.130 09:28:05 -- scripts/common.sh@335 -- # IFS=.-: 00:04:43.130 09:28:05 -- scripts/common.sh@335 -- # read -ra ver1 00:04:43.130 09:28:05 -- scripts/common.sh@336 -- # IFS=.-: 00:04:43.130 09:28:05 -- scripts/common.sh@336 -- # read -ra ver2 00:04:43.130 09:28:05 -- scripts/common.sh@337 -- # local 'op=<' 00:04:43.130 09:28:05 -- scripts/common.sh@339 -- # ver1_l=2 00:04:43.130 09:28:05 -- scripts/common.sh@340 -- # ver2_l=1 00:04:43.130 09:28:05 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:43.130 09:28:05 -- scripts/common.sh@343 -- # case "$op" in 00:04:43.130 09:28:05 -- scripts/common.sh@344 -- # : 1 00:04:43.130 09:28:05 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:43.130 09:28:05 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:43.130 09:28:05 -- scripts/common.sh@364 -- # decimal 1 00:04:43.130 09:28:05 -- scripts/common.sh@352 -- # local d=1 00:04:43.130 09:28:05 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:43.130 09:28:05 -- scripts/common.sh@354 -- # echo 1 00:04:43.130 09:28:05 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:43.130 09:28:05 -- scripts/common.sh@365 -- # decimal 2 00:04:43.130 09:28:05 -- scripts/common.sh@352 -- # local d=2 00:04:43.130 09:28:05 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:43.130 09:28:05 -- scripts/common.sh@354 -- # echo 2 00:04:43.130 09:28:05 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:43.130 09:28:05 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:43.130 09:28:05 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:43.130 09:28:05 -- scripts/common.sh@367 -- # return 0 00:04:43.130 09:28:05 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:43.130 09:28:05 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:43.130 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:43.130 --rc genhtml_branch_coverage=1 00:04:43.130 --rc genhtml_function_coverage=1 00:04:43.130 --rc genhtml_legend=1 00:04:43.130 --rc geninfo_all_blocks=1 00:04:43.130 --rc geninfo_unexecuted_blocks=1 00:04:43.130 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:43.130 ' 00:04:43.130 09:28:05 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:43.130 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:43.130 --rc genhtml_branch_coverage=1 00:04:43.130 --rc genhtml_function_coverage=1 00:04:43.130 --rc genhtml_legend=1 00:04:43.130 --rc geninfo_all_blocks=1 00:04:43.130 --rc geninfo_unexecuted_blocks=1 00:04:43.130 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:43.130 ' 00:04:43.130 09:28:05 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:43.130 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:43.131 --rc genhtml_branch_coverage=1 
00:04:43.131 --rc genhtml_function_coverage=1 00:04:43.131 --rc genhtml_legend=1 00:04:43.131 --rc geninfo_all_blocks=1 00:04:43.131 --rc geninfo_unexecuted_blocks=1 00:04:43.131 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:43.131 ' 00:04:43.131 09:28:05 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:43.131 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:43.131 --rc genhtml_branch_coverage=1 00:04:43.131 --rc genhtml_function_coverage=1 00:04:43.131 --rc genhtml_legend=1 00:04:43.131 --rc geninfo_all_blocks=1 00:04:43.131 --rc geninfo_unexecuted_blocks=1 00:04:43.131 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:43.131 ' 00:04:43.131 09:28:05 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:04:43.131 09:28:05 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:04:43.131 09:28:05 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:04:43.131 09:28:05 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:04:43.131 09:28:05 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:04:43.131 09:28:05 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:04:43.131 09:28:05 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:04:43.131 09:28:05 -- common/autotest_common.sh@722 -- # xtrace_disable 00:04:43.131 09:28:05 -- common/autotest_common.sh@10 -- # set +x 00:04:43.131 09:28:05 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=3156497 00:04:43.131 09:28:05 -- spdkcli/tcp.sh@27 -- # waitforlisten 3156497 00:04:43.131 09:28:05 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:04:43.131 09:28:05 -- common/autotest_common.sh@829 -- # '[' -z 3156497 ']' 00:04:43.131 09:28:05 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:43.131 09:28:05 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:43.131 09:28:05 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:43.131 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:43.131 09:28:05 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:43.131 09:28:05 -- common/autotest_common.sh@10 -- # set +x 00:04:43.131 [2024-11-29 09:28:05.943459] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
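For the TCP test the target is launched with -m 0x3 -p 0: a two-core mask with the main reactor pinned to core 0, which is why two reactor_run notices appear in the records that follow. Stripped of the harness, the launch is just:

    build/bin/spdk_tgt -m 0x3 -p 0 &    # reactors on cores 0 and 1, core 0 is main
    spdk_tgt_pid=$!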
00:04:43.131 [2024-11-29 09:28:05.943529] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3156497 ] 00:04:43.390 EAL: No free 2048 kB hugepages reported on node 1 00:04:43.390 [2024-11-29 09:28:06.010463] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:04:43.390 [2024-11-29 09:28:06.086317] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:43.390 [2024-11-29 09:28:06.086473] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:04:43.390 [2024-11-29 09:28:06.086475] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:43.957 09:28:06 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:43.957 09:28:06 -- common/autotest_common.sh@862 -- # return 0 00:04:43.957 09:28:06 -- spdkcli/tcp.sh@31 -- # socat_pid=3156744 00:04:43.957 09:28:06 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:04:43.957 09:28:06 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:04:44.216 [ 00:04:44.216 "spdk_get_version", 00:04:44.216 "rpc_get_methods", 00:04:44.216 "trace_get_info", 00:04:44.216 "trace_get_tpoint_group_mask", 00:04:44.216 "trace_disable_tpoint_group", 00:04:44.216 "trace_enable_tpoint_group", 00:04:44.216 "trace_clear_tpoint_mask", 00:04:44.216 "trace_set_tpoint_mask", 00:04:44.216 "vfu_tgt_set_base_path", 00:04:44.217 "framework_get_pci_devices", 00:04:44.217 "framework_get_config", 00:04:44.217 "framework_get_subsystems", 00:04:44.217 "iobuf_get_stats", 00:04:44.217 "iobuf_set_options", 00:04:44.217 "sock_set_default_impl", 00:04:44.217 "sock_impl_set_options", 00:04:44.217 "sock_impl_get_options", 00:04:44.217 "vmd_rescan", 00:04:44.217 "vmd_remove_device", 00:04:44.217 "vmd_enable", 00:04:44.217 "accel_get_stats", 00:04:44.217 "accel_set_options", 00:04:44.217 "accel_set_driver", 00:04:44.217 "accel_crypto_key_destroy", 00:04:44.217 "accel_crypto_keys_get", 00:04:44.217 "accel_crypto_key_create", 00:04:44.217 "accel_assign_opc", 00:04:44.217 "accel_get_module_info", 00:04:44.217 "accel_get_opc_assignments", 00:04:44.217 "notify_get_notifications", 00:04:44.217 "notify_get_types", 00:04:44.217 "bdev_get_histogram", 00:04:44.217 "bdev_enable_histogram", 00:04:44.217 "bdev_set_qos_limit", 00:04:44.217 "bdev_set_qd_sampling_period", 00:04:44.217 "bdev_get_bdevs", 00:04:44.217 "bdev_reset_iostat", 00:04:44.217 "bdev_get_iostat", 00:04:44.217 "bdev_examine", 00:04:44.217 "bdev_wait_for_examine", 00:04:44.217 "bdev_set_options", 00:04:44.217 "scsi_get_devices", 00:04:44.217 "thread_set_cpumask", 00:04:44.217 "framework_get_scheduler", 00:04:44.217 "framework_set_scheduler", 00:04:44.217 "framework_get_reactors", 00:04:44.217 "thread_get_io_channels", 00:04:44.217 "thread_get_pollers", 00:04:44.217 "thread_get_stats", 00:04:44.217 "framework_monitor_context_switch", 00:04:44.217 "spdk_kill_instance", 00:04:44.217 "log_enable_timestamps", 00:04:44.217 "log_get_flags", 00:04:44.217 "log_clear_flag", 00:04:44.217 "log_set_flag", 00:04:44.217 "log_get_level", 00:04:44.217 "log_set_level", 00:04:44.217 "log_get_print_level", 00:04:44.217 "log_set_print_level", 00:04:44.217 "framework_enable_cpumask_locks", 00:04:44.217 "framework_disable_cpumask_locks", 00:04:44.217 "framework_wait_init", 00:04:44.217 
"framework_start_init", 00:04:44.217 "virtio_blk_create_transport", 00:04:44.217 "virtio_blk_get_transports", 00:04:44.217 "vhost_controller_set_coalescing", 00:04:44.217 "vhost_get_controllers", 00:04:44.217 "vhost_delete_controller", 00:04:44.217 "vhost_create_blk_controller", 00:04:44.217 "vhost_scsi_controller_remove_target", 00:04:44.217 "vhost_scsi_controller_add_target", 00:04:44.217 "vhost_start_scsi_controller", 00:04:44.217 "vhost_create_scsi_controller", 00:04:44.217 "ublk_recover_disk", 00:04:44.217 "ublk_get_disks", 00:04:44.217 "ublk_stop_disk", 00:04:44.217 "ublk_start_disk", 00:04:44.217 "ublk_destroy_target", 00:04:44.217 "ublk_create_target", 00:04:44.217 "nbd_get_disks", 00:04:44.217 "nbd_stop_disk", 00:04:44.217 "nbd_start_disk", 00:04:44.217 "env_dpdk_get_mem_stats", 00:04:44.217 "nvmf_subsystem_get_listeners", 00:04:44.217 "nvmf_subsystem_get_qpairs", 00:04:44.217 "nvmf_subsystem_get_controllers", 00:04:44.217 "nvmf_get_stats", 00:04:44.217 "nvmf_get_transports", 00:04:44.217 "nvmf_create_transport", 00:04:44.217 "nvmf_get_targets", 00:04:44.217 "nvmf_delete_target", 00:04:44.217 "nvmf_create_target", 00:04:44.217 "nvmf_subsystem_allow_any_host", 00:04:44.217 "nvmf_subsystem_remove_host", 00:04:44.217 "nvmf_subsystem_add_host", 00:04:44.217 "nvmf_subsystem_remove_ns", 00:04:44.217 "nvmf_subsystem_add_ns", 00:04:44.217 "nvmf_subsystem_listener_set_ana_state", 00:04:44.217 "nvmf_discovery_get_referrals", 00:04:44.217 "nvmf_discovery_remove_referral", 00:04:44.217 "nvmf_discovery_add_referral", 00:04:44.217 "nvmf_subsystem_remove_listener", 00:04:44.217 "nvmf_subsystem_add_listener", 00:04:44.217 "nvmf_delete_subsystem", 00:04:44.217 "nvmf_create_subsystem", 00:04:44.217 "nvmf_get_subsystems", 00:04:44.217 "nvmf_set_crdt", 00:04:44.217 "nvmf_set_config", 00:04:44.217 "nvmf_set_max_subsystems", 00:04:44.217 "iscsi_set_options", 00:04:44.217 "iscsi_get_auth_groups", 00:04:44.217 "iscsi_auth_group_remove_secret", 00:04:44.217 "iscsi_auth_group_add_secret", 00:04:44.217 "iscsi_delete_auth_group", 00:04:44.217 "iscsi_create_auth_group", 00:04:44.217 "iscsi_set_discovery_auth", 00:04:44.217 "iscsi_get_options", 00:04:44.217 "iscsi_target_node_request_logout", 00:04:44.217 "iscsi_target_node_set_redirect", 00:04:44.217 "iscsi_target_node_set_auth", 00:04:44.217 "iscsi_target_node_add_lun", 00:04:44.217 "iscsi_get_connections", 00:04:44.217 "iscsi_portal_group_set_auth", 00:04:44.217 "iscsi_start_portal_group", 00:04:44.217 "iscsi_delete_portal_group", 00:04:44.217 "iscsi_create_portal_group", 00:04:44.217 "iscsi_get_portal_groups", 00:04:44.217 "iscsi_delete_target_node", 00:04:44.217 "iscsi_target_node_remove_pg_ig_maps", 00:04:44.217 "iscsi_target_node_add_pg_ig_maps", 00:04:44.217 "iscsi_create_target_node", 00:04:44.217 "iscsi_get_target_nodes", 00:04:44.217 "iscsi_delete_initiator_group", 00:04:44.217 "iscsi_initiator_group_remove_initiators", 00:04:44.217 "iscsi_initiator_group_add_initiators", 00:04:44.217 "iscsi_create_initiator_group", 00:04:44.217 "iscsi_get_initiator_groups", 00:04:44.217 "vfu_virtio_create_scsi_endpoint", 00:04:44.217 "vfu_virtio_scsi_remove_target", 00:04:44.217 "vfu_virtio_scsi_add_target", 00:04:44.217 "vfu_virtio_create_blk_endpoint", 00:04:44.217 "vfu_virtio_delete_endpoint", 00:04:44.217 "iaa_scan_accel_module", 00:04:44.217 "dsa_scan_accel_module", 00:04:44.217 "ioat_scan_accel_module", 00:04:44.217 "accel_error_inject_error", 00:04:44.217 "bdev_iscsi_delete", 00:04:44.217 "bdev_iscsi_create", 00:04:44.217 "bdev_iscsi_set_options", 
00:04:44.217 "bdev_virtio_attach_controller", 00:04:44.217 "bdev_virtio_scsi_get_devices", 00:04:44.217 "bdev_virtio_detach_controller", 00:04:44.217 "bdev_virtio_blk_set_hotplug", 00:04:44.217 "bdev_ftl_set_property", 00:04:44.217 "bdev_ftl_get_properties", 00:04:44.217 "bdev_ftl_get_stats", 00:04:44.217 "bdev_ftl_unmap", 00:04:44.217 "bdev_ftl_unload", 00:04:44.217 "bdev_ftl_delete", 00:04:44.217 "bdev_ftl_load", 00:04:44.217 "bdev_ftl_create", 00:04:44.217 "bdev_aio_delete", 00:04:44.217 "bdev_aio_rescan", 00:04:44.217 "bdev_aio_create", 00:04:44.217 "blobfs_create", 00:04:44.217 "blobfs_detect", 00:04:44.217 "blobfs_set_cache_size", 00:04:44.217 "bdev_zone_block_delete", 00:04:44.217 "bdev_zone_block_create", 00:04:44.217 "bdev_delay_delete", 00:04:44.217 "bdev_delay_create", 00:04:44.217 "bdev_delay_update_latency", 00:04:44.217 "bdev_split_delete", 00:04:44.217 "bdev_split_create", 00:04:44.217 "bdev_error_inject_error", 00:04:44.217 "bdev_error_delete", 00:04:44.217 "bdev_error_create", 00:04:44.217 "bdev_raid_set_options", 00:04:44.217 "bdev_raid_remove_base_bdev", 00:04:44.217 "bdev_raid_add_base_bdev", 00:04:44.217 "bdev_raid_delete", 00:04:44.217 "bdev_raid_create", 00:04:44.217 "bdev_raid_get_bdevs", 00:04:44.217 "bdev_lvol_grow_lvstore", 00:04:44.217 "bdev_lvol_get_lvols", 00:04:44.217 "bdev_lvol_get_lvstores", 00:04:44.217 "bdev_lvol_delete", 00:04:44.217 "bdev_lvol_set_read_only", 00:04:44.217 "bdev_lvol_resize", 00:04:44.217 "bdev_lvol_decouple_parent", 00:04:44.217 "bdev_lvol_inflate", 00:04:44.217 "bdev_lvol_rename", 00:04:44.217 "bdev_lvol_clone_bdev", 00:04:44.217 "bdev_lvol_clone", 00:04:44.217 "bdev_lvol_snapshot", 00:04:44.217 "bdev_lvol_create", 00:04:44.217 "bdev_lvol_delete_lvstore", 00:04:44.217 "bdev_lvol_rename_lvstore", 00:04:44.217 "bdev_lvol_create_lvstore", 00:04:44.217 "bdev_passthru_delete", 00:04:44.217 "bdev_passthru_create", 00:04:44.217 "bdev_nvme_cuse_unregister", 00:04:44.217 "bdev_nvme_cuse_register", 00:04:44.217 "bdev_opal_new_user", 00:04:44.217 "bdev_opal_set_lock_state", 00:04:44.217 "bdev_opal_delete", 00:04:44.217 "bdev_opal_get_info", 00:04:44.217 "bdev_opal_create", 00:04:44.217 "bdev_nvme_opal_revert", 00:04:44.217 "bdev_nvme_opal_init", 00:04:44.217 "bdev_nvme_send_cmd", 00:04:44.217 "bdev_nvme_get_path_iostat", 00:04:44.217 "bdev_nvme_get_mdns_discovery_info", 00:04:44.217 "bdev_nvme_stop_mdns_discovery", 00:04:44.217 "bdev_nvme_start_mdns_discovery", 00:04:44.217 "bdev_nvme_set_multipath_policy", 00:04:44.217 "bdev_nvme_set_preferred_path", 00:04:44.217 "bdev_nvme_get_io_paths", 00:04:44.217 "bdev_nvme_remove_error_injection", 00:04:44.217 "bdev_nvme_add_error_injection", 00:04:44.217 "bdev_nvme_get_discovery_info", 00:04:44.217 "bdev_nvme_stop_discovery", 00:04:44.217 "bdev_nvme_start_discovery", 00:04:44.217 "bdev_nvme_get_controller_health_info", 00:04:44.217 "bdev_nvme_disable_controller", 00:04:44.217 "bdev_nvme_enable_controller", 00:04:44.217 "bdev_nvme_reset_controller", 00:04:44.217 "bdev_nvme_get_transport_statistics", 00:04:44.217 "bdev_nvme_apply_firmware", 00:04:44.217 "bdev_nvme_detach_controller", 00:04:44.217 "bdev_nvme_get_controllers", 00:04:44.217 "bdev_nvme_attach_controller", 00:04:44.217 "bdev_nvme_set_hotplug", 00:04:44.217 "bdev_nvme_set_options", 00:04:44.217 "bdev_null_resize", 00:04:44.217 "bdev_null_delete", 00:04:44.217 "bdev_null_create", 00:04:44.217 "bdev_malloc_delete", 00:04:44.217 "bdev_malloc_create" 00:04:44.217 ] 00:04:44.217 09:28:06 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:04:44.217 09:28:06 -- common/autotest_common.sh@728 -- # xtrace_disable 00:04:44.217 09:28:06 -- common/autotest_common.sh@10 -- # set +x 00:04:44.218 09:28:06 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:04:44.218 09:28:06 -- spdkcli/tcp.sh@38 -- # killprocess 3156497 00:04:44.218 09:28:06 -- common/autotest_common.sh@936 -- # '[' -z 3156497 ']' 00:04:44.218 09:28:06 -- common/autotest_common.sh@940 -- # kill -0 3156497 00:04:44.218 09:28:06 -- common/autotest_common.sh@941 -- # uname 00:04:44.218 09:28:06 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:04:44.218 09:28:06 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3156497 00:04:44.218 09:28:07 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:04:44.218 09:28:07 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:04:44.218 09:28:07 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3156497' 00:04:44.218 killing process with pid 3156497 00:04:44.218 09:28:07 -- common/autotest_common.sh@955 -- # kill 3156497 00:04:44.218 09:28:07 -- common/autotest_common.sh@960 -- # wait 3156497 00:04:44.786 00:04:44.786 real 0m1.608s 00:04:44.786 user 0m2.913s 00:04:44.786 sys 0m0.514s 00:04:44.786 09:28:07 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:44.786 09:28:07 -- common/autotest_common.sh@10 -- # set +x 00:04:44.786 ************************************ 00:04:44.786 END TEST spdkcli_tcp 00:04:44.786 ************************************ 00:04:44.786 09:28:07 -- spdk/autotest.sh@173 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:04:44.786 09:28:07 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:04:44.786 09:28:07 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:04:44.786 09:28:07 -- common/autotest_common.sh@10 -- # set +x 00:04:44.786 ************************************ 00:04:44.786 START TEST dpdk_mem_utility 00:04:44.786 ************************************ 00:04:44.786 09:28:07 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:04:44.786 * Looking for test storage... 
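Before signalling the target, killprocess (above) inspects the process with ps --no-headers -o comm= so that it only kills a reactor and never a sudo wrapper. That guard reduces to roughly:

    pid=3156497
    process_name=$(ps --no-headers -o comm= "$pid")
    # refuse to signal if the pid turns out to be a sudo wrapper
    if [ "$process_name" != sudo ]; then
        kill "$pid"
        wait "$pid"
    fi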
00:04:44.786 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:04:44.786 09:28:07 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:04:44.786 09:28:07 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:04:44.786 09:28:07 -- common/autotest_common.sh@1690 -- # lcov --version 00:04:44.786 09:28:07 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:04:44.786 09:28:07 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:04:44.786 09:28:07 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:04:44.786 09:28:07 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:04:44.786 09:28:07 -- scripts/common.sh@335 -- # IFS=.-: 00:04:44.786 09:28:07 -- scripts/common.sh@335 -- # read -ra ver1 00:04:44.786 09:28:07 -- scripts/common.sh@336 -- # IFS=.-: 00:04:44.786 09:28:07 -- scripts/common.sh@336 -- # read -ra ver2 00:04:44.786 09:28:07 -- scripts/common.sh@337 -- # local 'op=<' 00:04:44.786 09:28:07 -- scripts/common.sh@339 -- # ver1_l=2 00:04:44.786 09:28:07 -- scripts/common.sh@340 -- # ver2_l=1 00:04:44.786 09:28:07 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:04:44.786 09:28:07 -- scripts/common.sh@343 -- # case "$op" in 00:04:44.786 09:28:07 -- scripts/common.sh@344 -- # : 1 00:04:44.786 09:28:07 -- scripts/common.sh@363 -- # (( v = 0 )) 00:04:44.786 09:28:07 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:44.786 09:28:07 -- scripts/common.sh@364 -- # decimal 1 00:04:44.786 09:28:07 -- scripts/common.sh@352 -- # local d=1 00:04:44.786 09:28:07 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:44.786 09:28:07 -- scripts/common.sh@354 -- # echo 1 00:04:44.786 09:28:07 -- scripts/common.sh@364 -- # ver1[v]=1 00:04:44.786 09:28:07 -- scripts/common.sh@365 -- # decimal 2 00:04:44.786 09:28:07 -- scripts/common.sh@352 -- # local d=2 00:04:44.786 09:28:07 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:44.786 09:28:07 -- scripts/common.sh@354 -- # echo 2 00:04:44.786 09:28:07 -- scripts/common.sh@365 -- # ver2[v]=2 00:04:44.786 09:28:07 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:04:44.787 09:28:07 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:04:44.787 09:28:07 -- scripts/common.sh@367 -- # return 0 00:04:44.787 09:28:07 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:44.787 09:28:07 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:04:44.787 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:44.787 --rc genhtml_branch_coverage=1 00:04:44.787 --rc genhtml_function_coverage=1 00:04:44.787 --rc genhtml_legend=1 00:04:44.787 --rc geninfo_all_blocks=1 00:04:44.787 --rc geninfo_unexecuted_blocks=1 00:04:44.787 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:44.787 ' 00:04:44.787 09:28:07 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:04:44.787 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:44.787 --rc genhtml_branch_coverage=1 00:04:44.787 --rc genhtml_function_coverage=1 00:04:44.787 --rc genhtml_legend=1 00:04:44.787 --rc geninfo_all_blocks=1 00:04:44.787 --rc geninfo_unexecuted_blocks=1 00:04:44.787 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:44.787 ' 00:04:44.787 09:28:07 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:04:44.787 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:44.787 --rc 
genhtml_branch_coverage=1 00:04:44.787 --rc genhtml_function_coverage=1 00:04:44.787 --rc genhtml_legend=1 00:04:44.787 --rc geninfo_all_blocks=1 00:04:44.787 --rc geninfo_unexecuted_blocks=1 00:04:44.787 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:44.787 ' 00:04:44.787 09:28:07 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:04:44.787 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:44.787 --rc genhtml_branch_coverage=1 00:04:44.787 --rc genhtml_function_coverage=1 00:04:44.787 --rc genhtml_legend=1 00:04:44.787 --rc geninfo_all_blocks=1 00:04:44.787 --rc geninfo_unexecuted_blocks=1 00:04:44.787 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:44.787 ' 00:04:44.787 09:28:07 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:04:44.787 09:28:07 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:04:44.787 09:28:07 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=3156842 00:04:44.787 09:28:07 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 3156842 00:04:44.787 09:28:07 -- common/autotest_common.sh@829 -- # '[' -z 3156842 ']' 00:04:44.787 09:28:07 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:04:44.787 09:28:07 -- common/autotest_common.sh@834 -- # local max_retries=100 00:04:44.787 09:28:07 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:04:44.787 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:04:44.787 09:28:07 -- common/autotest_common.sh@838 -- # xtrace_disable 00:04:44.787 09:28:07 -- common/autotest_common.sh@10 -- # set +x 00:04:44.787 [2024-11-29 09:28:07.599894] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
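With the dpdk_mem_utility target up, the test asks it to dump its DPDK memory state and then post-processes the dump, as the records below show. In outline, and assuming dpdk_mem_info.py reads the /tmp/spdk_mem_dump.txt path the RPC reports:

    scripts/rpc.py env_dpdk_get_mem_stats    # target writes /tmp/spdk_mem_dump.txt
    scripts/dpdk_mem_info.py                 # heap, mempool and memzone totals
    scripts/dpdk_mem_info.py -m 0            # element-level detail for heap 0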
00:04:44.787 [2024-11-29 09:28:07.599952] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3156842 ] 00:04:45.047 EAL: No free 2048 kB hugepages reported on node 1 00:04:45.047 [2024-11-29 09:28:07.666465] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:04:45.047 [2024-11-29 09:28:07.741372] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:04:45.047 [2024-11-29 09:28:07.741481] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:04:45.615 09:28:08 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:04:45.615 09:28:08 -- common/autotest_common.sh@862 -- # return 0 00:04:45.615 09:28:08 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:04:45.615 09:28:08 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:04:45.615 09:28:08 -- common/autotest_common.sh@561 -- # xtrace_disable 00:04:45.615 09:28:08 -- common/autotest_common.sh@10 -- # set +x 00:04:45.875 { 00:04:45.875 "filename": "/tmp/spdk_mem_dump.txt" 00:04:45.875 } 00:04:45.875 09:28:08 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:04:45.875 09:28:08 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:04:45.875 DPDK memory size 814.000000 MiB in 1 heap(s) 00:04:45.875 1 heaps totaling size 814.000000 MiB 00:04:45.875 size: 814.000000 MiB heap id: 0 00:04:45.875 end heaps---------- 00:04:45.875 8 mempools totaling size 598.116089 MiB 00:04:45.875 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:04:45.875 size: 158.602051 MiB name: PDU_data_out_Pool 00:04:45.875 size: 84.521057 MiB name: bdev_io_3156842 00:04:45.875 size: 51.011292 MiB name: evtpool_3156842 00:04:45.875 size: 50.003479 MiB name: msgpool_3156842 00:04:45.875 size: 21.763794 MiB name: PDU_Pool 00:04:45.875 size: 19.513306 MiB name: SCSI_TASK_Pool 00:04:45.875 size: 0.026123 MiB name: Session_Pool 00:04:45.875 end mempools------- 00:04:45.875 6 memzones totaling size 4.142822 MiB 00:04:45.875 size: 1.000366 MiB name: RG_ring_0_3156842 00:04:45.875 size: 1.000366 MiB name: RG_ring_1_3156842 00:04:45.875 size: 1.000366 MiB name: RG_ring_4_3156842 00:04:45.875 size: 1.000366 MiB name: RG_ring_5_3156842 00:04:45.875 size: 0.125366 MiB name: RG_ring_2_3156842 00:04:45.875 size: 0.015991 MiB name: RG_ring_3_3156842 00:04:45.875 end memzones------- 00:04:45.875 09:28:08 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:04:45.875 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:04:45.875 list of free elements. 
size: 12.519348 MiB
00:04:45.875 element at address: 0x200000400000 with size: 1.999512 MiB
00:04:45.875 element at address: 0x200018e00000 with size: 0.999878 MiB
00:04:45.875 element at address: 0x200019000000 with size: 0.999878 MiB
00:04:45.875 element at address: 0x200003e00000 with size: 0.996277 MiB
00:04:45.875 element at address: 0x200031c00000 with size: 0.994446 MiB
00:04:45.875 element at address: 0x200013800000 with size: 0.978699 MiB
00:04:45.875 element at address: 0x200007000000 with size: 0.959839 MiB
00:04:45.875 element at address: 0x200019200000 with size: 0.936584 MiB
00:04:45.875 element at address: 0x200000200000 with size: 0.841614 MiB
00:04:45.875 element at address: 0x20001aa00000 with size: 0.582886 MiB
00:04:45.875 element at address: 0x20000b200000 with size: 0.490723 MiB
00:04:45.875 element at address: 0x200000800000 with size: 0.487793 MiB
00:04:45.875 element at address: 0x200019400000 with size: 0.485657 MiB
00:04:45.875 element at address: 0x200027e00000 with size: 0.410034 MiB
00:04:45.875 element at address: 0x200003a00000 with size: 0.355530 MiB
00:04:45.875 list of standard malloc elements. size: 199.218079 MiB
00:04:45.875 element at address: 0x20000b3fff80 with size: 132.000122 MiB
00:04:45.875 element at address: 0x2000071fff80 with size: 64.000122 MiB
00:04:45.875 element at address: 0x200018efff80 with size: 1.000122 MiB
00:04:45.875 element at address: 0x2000190fff80 with size: 1.000122 MiB
00:04:45.875 element at address: 0x2000192fff80 with size: 1.000122 MiB
00:04:45.875 element at address: 0x2000003d9f00 with size: 0.140747 MiB
00:04:45.875 element at address: 0x2000192eff00 with size: 0.062622 MiB
00:04:45.875 element at address: 0x2000003fdf80 with size: 0.007935 MiB
00:04:45.875 element at address: 0x2000192efdc0 with size: 0.000305 MiB
00:04:45.875 element at address: 0x2000002d7740 with size: 0.000183 MiB
00:04:45.875 element at address: 0x2000002d7800 with size: 0.000183 MiB
00:04:45.875 element at address: 0x2000002d78c0 with size: 0.000183 MiB
00:04:45.875 element at address: 0x2000002d7ac0 with size: 0.000183 MiB
00:04:45.875 element at address: 0x2000002d7b80 with size: 0.000183 MiB
00:04:45.875 element at address: 0x2000002d7c40 with size: 0.000183 MiB
00:04:45.875 element at address: 0x2000003d9e40 with size: 0.000183 MiB
00:04:45.875 element at address: 0x20000087ce00 with size: 0.000183 MiB
00:04:45.875 element at address: 0x20000087cec0 with size: 0.000183 MiB
00:04:45.875 element at address: 0x2000008fd180 with size: 0.000183 MiB
00:04:45.875 element at address: 0x200003a5b040 with size: 0.000183 MiB
00:04:45.875 element at address: 0x200003adb300 with size: 0.000183 MiB
00:04:45.875 element at address: 0x200003adb500 with size: 0.000183 MiB
00:04:45.875 element at address: 0x200003adf7c0 with size: 0.000183 MiB
00:04:45.875 element at address: 0x200003affa80 with size: 0.000183 MiB
00:04:45.875 element at address: 0x200003affb40 with size: 0.000183 MiB
00:04:45.875 element at address: 0x200003eff0c0 with size: 0.000183 MiB
00:04:45.875 element at address: 0x2000070fdd80 with size: 0.000183 MiB
00:04:45.875 element at address: 0x20000b27da00 with size: 0.000183 MiB
00:04:45.875 element at address: 0x20000b27dac0 with size: 0.000183 MiB
00:04:45.875 element at address: 0x20000b2fdd80 with size: 0.000183 MiB
00:04:45.875 element at address: 0x2000138fa8c0 with size: 0.000183 MiB
00:04:45.875 element at address: 0x2000192efc40 with size: 0.000183 MiB
00:04:45.875 element at address: 0x2000192efd00 with size: 0.000183 MiB
00:04:45.875 element at address: 0x2000194bc740 with size: 0.000183 MiB
00:04:45.875 element at address: 0x20001aa95380 with size: 0.000183 MiB
00:04:45.875 element at address: 0x20001aa95440 with size: 0.000183 MiB
00:04:45.875 element at address: 0x200027e68f80 with size: 0.000183 MiB
00:04:45.875 element at address: 0x200027e69040 with size: 0.000183 MiB
00:04:45.875 element at address: 0x200027e6fc40 with size: 0.000183 MiB
00:04:45.875 element at address: 0x200027e6fe40 with size: 0.000183 MiB
00:04:45.875 element at address: 0x200027e6ff00 with size: 0.000183 MiB
00:04:45.875 list of memzone associated elements. size: 602.262573 MiB
00:04:45.875 element at address: 0x20001aa95500 with size: 211.416748 MiB
00:04:45.875 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0
00:04:45.875 element at address: 0x200027e6ffc0 with size: 157.562561 MiB
00:04:45.875 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0
00:04:45.875 element at address: 0x2000139fab80 with size: 84.020630 MiB
00:04:45.875 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_3156842_0
00:04:45.875 element at address: 0x2000009ff380 with size: 48.003052 MiB
00:04:45.875 associated memzone info: size: 48.002930 MiB name: MP_evtpool_3156842_0
00:04:45.875 element at address: 0x200003fff380 with size: 48.003052 MiB
00:04:45.875 associated memzone info: size: 48.002930 MiB name: MP_msgpool_3156842_0
00:04:45.875 element at address: 0x2000195be940 with size: 20.255554 MiB
00:04:45.875 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0
00:04:45.875 element at address: 0x200031dfeb40 with size: 18.005066 MiB
00:04:45.875 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0
00:04:45.875 element at address: 0x2000005ffe00 with size: 2.000488 MiB
00:04:45.875 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_3156842
00:04:45.875 element at address: 0x200003bffe00 with size: 2.000488 MiB
00:04:45.875 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_3156842
00:04:45.875 element at address: 0x2000002d7d00 with size: 1.008118 MiB
00:04:45.875 associated memzone info: size: 1.007996 MiB name: MP_evtpool_3156842
00:04:45.875 element at address: 0x20000b2fde40 with size: 1.008118 MiB
00:04:45.875 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool
00:04:45.875 element at address: 0x2000194bc800 with size: 1.008118 MiB
00:04:45.875 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool
00:04:45.875 element at address: 0x2000070fde40 with size: 1.008118 MiB
00:04:45.875 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool
00:04:45.875 element at address: 0x2000008fd240 with size: 1.008118 MiB
00:04:45.875 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool
00:04:45.875 element at address: 0x200003eff180 with size: 1.000488 MiB
00:04:45.875 associated memzone info: size: 1.000366 MiB name: RG_ring_0_3156842
00:04:45.875 element at address: 0x200003affc00 with size: 1.000488 MiB
00:04:45.875 associated memzone info: size: 1.000366 MiB name: RG_ring_1_3156842
00:04:45.875 element at address: 0x2000138fa980 with size: 1.000488 MiB
00:04:45.875 associated memzone info: size: 1.000366 MiB name: RG_ring_4_3156842
00:04:45.875 element at address: 0x200031cfe940 with size: 1.000488 MiB
00:04:45.875 associated memzone info: size: 1.000366 MiB name: RG_ring_5_3156842
00:04:45.875 element at address: 0x200003a5b100 with size: 0.500488 MiB
00:04:45.875 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_3156842
00:04:45.875 element at address: 0x20000b27db80 with size: 0.500488 MiB
00:04:45.875 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool
00:04:45.875 element at address: 0x20000087cf80 with size: 0.500488 MiB
00:04:45.875 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool
00:04:45.875 element at address: 0x20001947c540 with size: 0.250488 MiB
00:04:45.875 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool
00:04:45.875 element at address: 0x200003adf880 with size: 0.125488 MiB
00:04:45.875 associated memzone info: size: 0.125366 MiB name: RG_ring_2_3156842
00:04:45.875 element at address: 0x2000070f5b80 with size: 0.031738 MiB
00:04:45.875 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool
00:04:45.875 element at address: 0x200027e69100 with size: 0.023743 MiB
00:04:45.876 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0
00:04:45.876 element at address: 0x200003adb5c0 with size: 0.016113 MiB
00:04:45.876 associated memzone info: size: 0.015991 MiB name: RG_ring_3_3156842
00:04:45.876 element at address: 0x200027e6f240 with size: 0.002441 MiB
00:04:45.876 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool
00:04:45.876 element at address: 0x2000002d7980 with size: 0.000305 MiB
00:04:45.876 associated memzone info: size: 0.000183 MiB name: MP_msgpool_3156842
00:04:45.876 element at address: 0x200003adb3c0 with size: 0.000305 MiB
00:04:45.876 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_3156842
00:04:45.876 element at address: 0x200027e6fd00 with size: 0.000305 MiB
00:04:45.876 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool
00:04:45.876 09:28:08 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT
00:04:45.876 09:28:08 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 3156842
00:04:45.876 09:28:08 -- common/autotest_common.sh@936 -- # '[' -z 3156842 ']'
00:04:45.876 09:28:08 -- common/autotest_common.sh@940 -- # kill -0 3156842
00:04:45.876 09:28:08 -- common/autotest_common.sh@941 -- # uname
00:04:45.876 09:28:08 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:04:45.876 09:28:08 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3156842
00:04:45.876 09:28:08 -- common/autotest_common.sh@942 -- # process_name=reactor_0
00:04:45.876 09:28:08 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']'
00:04:45.876 09:28:08 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3156842'
killing process with pid 3156842
09:28:08 -- common/autotest_common.sh@955 -- # kill 3156842
09:28:08 -- common/autotest_common.sh@960 -- # wait 3156842
00:04:46.136
00:04:46.136 real 0m1.529s
00:04:46.136 user 0m1.586s
00:04:46.136 sys 0m0.463s
09:28:08 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:46.136 09:28:08 -- common/autotest_common.sh@10 -- # set +x
00:04:46.136 ************************************
00:04:46.136 END TEST dpdk_mem_utility
00:04:46.136 ************************************
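The killprocess helper traced above follows autotest's standard teardown pattern: confirm the pid is still alive with kill -0, check the process name so a sudo wrapper is never signalled, then kill and wait. A condensed sketch of that pattern (simplified from common/autotest_common.sh, not a verbatim copy):

    # Hedged sketch of the killprocess pattern seen in the trace above.
    killprocess() {
        local pid=$1
        [ -z "$pid" ] && return 1
        kill -0 "$pid" || return 1                         # is the pid still alive?
        local process_name
        process_name=$(ps --no-headers -o comm= "$pid")
        [ "$process_name" = "sudo" ] && return 1           # never signal a sudo wrapper
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"                                        # works because the target is a child of this shell
    }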
00:04:46.136 09:28:08 -- spdk/autotest.sh@174 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh
09:28:08 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
09:28:08 -- common/autotest_common.sh@1093 -- # xtrace_disable
09:28:08 -- common/autotest_common.sh@10 -- # set +x
00:04:46.395 ************************************
00:04:46.395 START TEST event
00:04:46.395 ************************************
00:04:46.395 09:28:08 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh
00:04:46.395 * Looking for test storage...
00:04:46.395 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event
00:04:46.395 09:28:09 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:04:46.395 09:28:09 -- common/autotest_common.sh@1690 -- # lcov --version
00:04:46.395 09:28:09 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:04:46.395 09:28:09 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:04:46.395 09:28:09 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:04:46.395 09:28:09 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:04:46.395 09:28:09 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:04:46.395 09:28:09 -- scripts/common.sh@335 -- # IFS=.-:
00:04:46.395 09:28:09 -- scripts/common.sh@335 -- # read -ra ver1
00:04:46.395 09:28:09 -- scripts/common.sh@336 -- # IFS=.-:
00:04:46.395 09:28:09 -- scripts/common.sh@336 -- # read -ra ver2
00:04:46.395 09:28:09 -- scripts/common.sh@337 -- # local 'op=<'
00:04:46.395 09:28:09 -- scripts/common.sh@339 -- # ver1_l=2
00:04:46.395 09:28:09 -- scripts/common.sh@340 -- # ver2_l=1
00:04:46.395 09:28:09 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:04:46.395 09:28:09 -- scripts/common.sh@343 -- # case "$op" in
00:04:46.395 09:28:09 -- scripts/common.sh@344 -- # : 1
00:04:46.395 09:28:09 -- scripts/common.sh@363 -- # (( v = 0 ))
00:04:46.395 09:28:09 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:04:46.396 09:28:09 -- scripts/common.sh@364 -- # decimal 1
00:04:46.396 09:28:09 -- scripts/common.sh@352 -- # local d=1
00:04:46.396 09:28:09 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:04:46.396 09:28:09 -- scripts/common.sh@354 -- # echo 1
00:04:46.396 09:28:09 -- scripts/common.sh@364 -- # ver1[v]=1
00:04:46.396 09:28:09 -- scripts/common.sh@365 -- # decimal 2
00:04:46.396 09:28:09 -- scripts/common.sh@352 -- # local d=2
00:04:46.396 09:28:09 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:04:46.396 09:28:09 -- scripts/common.sh@354 -- # echo 2
00:04:46.396 09:28:09 -- scripts/common.sh@365 -- # ver2[v]=2
00:04:46.396 09:28:09 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:04:46.396 09:28:09 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:04:46.396 09:28:09 -- scripts/common.sh@367 -- # return 0
00:04:46.396 09:28:09 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:04:46.396 09:28:09 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:04:46.396 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:46.396 --rc genhtml_branch_coverage=1
00:04:46.396 --rc genhtml_function_coverage=1
00:04:46.396 --rc genhtml_legend=1
00:04:46.396 --rc geninfo_all_blocks=1
00:04:46.396 --rc geninfo_unexecuted_blocks=1
00:04:46.396 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:46.396 '
00:04:46.396 09:28:09 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:04:46.396 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:46.396 --rc genhtml_branch_coverage=1
00:04:46.396 --rc genhtml_function_coverage=1
00:04:46.396 --rc genhtml_legend=1
00:04:46.396 --rc geninfo_all_blocks=1
00:04:46.396 --rc geninfo_unexecuted_blocks=1
00:04:46.396 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:46.396 '
00:04:46.396 09:28:09 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:04:46.396 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:46.396 --rc genhtml_branch_coverage=1
00:04:46.396 --rc genhtml_function_coverage=1
00:04:46.396 --rc genhtml_legend=1
00:04:46.396 --rc geninfo_all_blocks=1
00:04:46.396 --rc geninfo_unexecuted_blocks=1
00:04:46.396 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:46.396 '
00:04:46.396 09:28:09 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:04:46.396 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:46.396 --rc genhtml_branch_coverage=1
00:04:46.396 --rc genhtml_function_coverage=1
00:04:46.396 --rc genhtml_legend=1
00:04:46.396 --rc geninfo_all_blocks=1
00:04:46.396 --rc geninfo_unexecuted_blocks=1
00:04:46.396 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:46.396 '
00:04:46.396 09:28:09 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh
00:04:46.396 09:28:09 -- bdev/nbd_common.sh@6 -- # set -e
00:04:46.396 09:28:09 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:04:46.396 09:28:09 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']'
00:04:46.396 09:28:09 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:46.396 09:28:09 -- common/autotest_common.sh@10 -- # set +x
00:04:46.396 ************************************
00:04:46.396 START TEST event_perf
00:04:46.396 ************************************
00:04:46.396 09:28:09 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:04:46.396 Running I/O for 1 seconds...[2024-11-29 09:28:09.161183] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:04:46.396 [2024-11-29 09:28:09.161236] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3157179 ]
00:04:46.396 EAL: No free 2048 kB hugepages reported on node 1
00:04:46.656 [2024-11-29 09:28:09.225423] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4
00:04:46.656 [2024-11-29 09:28:09.299028] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:04:46.656 [2024-11-29 09:28:09.299124] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:04:46.656 [2024-11-29 09:28:09.299210] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:04:46.656 [2024-11-29 09:28:09.299212] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:04:47.594 Running I/O for 1 seconds...
00:04:47.594 lcore 0: 192473
00:04:47.594 lcore 1: 192474
00:04:47.594 lcore 2: 192474
00:04:47.594 lcore 3: 192474
00:04:47.594 done.
00:04:47.594
00:04:47.594 real 0m1.215s
00:04:47.594 user 0m4.120s
00:04:47.594 sys 0m0.091s
09:28:10 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:47.594 09:28:10 -- common/autotest_common.sh@10 -- # set +x
00:04:47.594 ************************************
00:04:47.594 END TEST event_perf
00:04:47.594 ************************************
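The event_perf run above can be reproduced by hand from the repo checkout; -m is the reactor core mask (0xF = cores 0-3) and -t is the run time in seconds, which is why four per-lcore event counters are printed after one second:

    cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    ./test/event/event_perf/event_perf -m 0xF -t 1   # -m core mask, -t seconds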
00:04:47.594 09:28:10 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1
09:28:10 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']'
09:28:10 -- common/autotest_common.sh@1093 -- # xtrace_disable
09:28:10 -- common/autotest_common.sh@10 -- # set +x
00:04:47.594 ************************************
00:04:47.594 START TEST event_reactor
00:04:47.594 ************************************
00:04:47.594 09:28:10 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:04:47.594 [2024-11-29 09:28:10.435819] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:04:47.594 [2024-11-29 09:28:10.435917] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3157464 ]
00:04:47.853 EAL: No free 2048 kB hugepages reported on node 1
00:04:47.853 [2024-11-29 09:28:10.508056] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:04:47.853 [2024-11-29 09:28:10.574472] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:04:49.230 test_start
00:04:49.230 oneshot
00:04:49.230 tick 100
00:04:49.230 tick 100
00:04:49.230 tick 250
00:04:49.230 tick 100
00:04:49.230 tick 100
00:04:49.230 tick 100
00:04:49.230 tick 250
00:04:49.230 tick 500
00:04:49.230 tick 100
00:04:49.230 tick 100
00:04:49.230 tick 250
00:04:49.230 tick 100
00:04:49.230 tick 100
00:04:49.230 test_end
00:04:49.230
00:04:49.230 real 0m1.218s
00:04:49.230 user 0m1.125s
00:04:49.230 sys 0m0.088s
09:28:11 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:49.230 09:28:11 -- common/autotest_common.sh@10 -- # set +x
00:04:49.230 ************************************
00:04:49.230 END TEST event_reactor
00:04:49.230 ************************************
00:04:49.230 09:28:11 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
09:28:11 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']'
09:28:11 -- common/autotest_common.sh@1093 -- # xtrace_disable
09:28:11 -- common/autotest_common.sh@10 -- # set +x
00:04:49.230 ************************************
00:04:49.230 START TEST event_reactor_perf
00:04:49.230 ************************************
00:04:49.230 09:28:11 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:04:49.230 [2024-11-29 09:28:11.701375] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:04:49.230 [2024-11-29 09:28:11.701467] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3157752 ]
00:04:49.230 EAL: No free 2048 kB hugepages reported on node 1
00:04:49.230 [2024-11-29 09:28:11.771711] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:04:49.230 [2024-11-29 09:28:11.836493] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:04:50.165 test_start
00:04:50.165 test_end
00:04:50.165 Performance: 971808 events per second
00:04:50.165
00:04:50.165 real 0m1.217s
00:04:50.165 user 0m1.130s
00:04:50.165 sys 0m0.082s
09:28:12 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:50.165 09:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:50.165 ************************************
00:04:50.165 END TEST event_reactor_perf
00:04:50.165 ************************************
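The lt 1.15 2 walk traced before each suite here is scripts/common.sh's component-wise version compare: both versions are split on '.', '-' and ':' and compared field by field. A condensed sketch of the same logic (the real cmp_versions also validates each field through decimal and supports the >, >= and <= operators):

    lt() { cmp_versions "$1" '<' "$2"; }
    cmp_versions() {
        local -a ver1 ver2
        local v
        IFS=.-: read -ra ver1 <<< "$1"
        IFS=.-: read -ra ver2 <<< "$3"
        for ((v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++)); do
            ((ver1[v] > ver2[v])) && return 1   # first larger field decides: not less-than
            ((ver1[v] < ver2[v])) && return 0   # first smaller field: less-than holds
        done
        return 1                                # equal versions: strict less-than fails
    }

For 1.15 vs 2 the first fields compare 1 < 2 and the function returns 0, exactly the @367 return 0 seen in the trace.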
00:04:50.165 09:28:12 -- event/event.sh@49 -- # uname -s
00:04:50.165 09:28:12 -- event/event.sh@49 -- # '[' Linux = Linux ']'
00:04:50.165 09:28:12 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh
09:28:12 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
09:28:12 -- common/autotest_common.sh@1093 -- # xtrace_disable
09:28:12 -- common/autotest_common.sh@10 -- # set +x
00:04:50.425 ************************************
00:04:50.425 START TEST event_scheduler
00:04:50.425 ************************************
00:04:50.425 09:28:12 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:04:50.425 * Looking for test storage...
00:04:50.425 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler
00:04:50.425 09:28:13 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:04:50.425 09:28:13 -- common/autotest_common.sh@1690 -- # lcov --version
00:04:50.425 09:28:13 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:04:50.425 09:28:13 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:04:50.425 09:28:13 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:04:50.425 09:28:13 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:04:50.425 09:28:13 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:04:50.425 09:28:13 -- scripts/common.sh@335 -- # IFS=.-:
00:04:50.425 09:28:13 -- scripts/common.sh@335 -- # read -ra ver1
00:04:50.425 09:28:13 -- scripts/common.sh@336 -- # IFS=.-:
00:04:50.425 09:28:13 -- scripts/common.sh@336 -- # read -ra ver2
00:04:50.425 09:28:13 -- scripts/common.sh@337 -- # local 'op=<'
00:04:50.425 09:28:13 -- scripts/common.sh@339 -- # ver1_l=2
00:04:50.425 09:28:13 -- scripts/common.sh@340 -- # ver2_l=1
00:04:50.425 09:28:13 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:04:50.425 09:28:13 -- scripts/common.sh@343 -- # case "$op" in
00:04:50.425 09:28:13 -- scripts/common.sh@344 -- # : 1
00:04:50.425 09:28:13 -- scripts/common.sh@363 -- # (( v = 0 ))
00:04:50.425 09:28:13 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:04:50.425 09:28:13 -- scripts/common.sh@364 -- # decimal 1
00:04:50.425 09:28:13 -- scripts/common.sh@352 -- # local d=1
00:04:50.425 09:28:13 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:04:50.425 09:28:13 -- scripts/common.sh@354 -- # echo 1
00:04:50.425 09:28:13 -- scripts/common.sh@364 -- # ver1[v]=1
00:04:50.425 09:28:13 -- scripts/common.sh@365 -- # decimal 2
00:04:50.425 09:28:13 -- scripts/common.sh@352 -- # local d=2
00:04:50.425 09:28:13 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:04:50.425 09:28:13 -- scripts/common.sh@354 -- # echo 2
00:04:50.425 09:28:13 -- scripts/common.sh@365 -- # ver2[v]=2
00:04:50.425 09:28:13 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:04:50.425 09:28:13 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:04:50.425 09:28:13 -- scripts/common.sh@367 -- # return 0
00:04:50.425 09:28:13 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:04:50.425 09:28:13 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:04:50.425 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:50.425 --rc genhtml_branch_coverage=1
00:04:50.425 --rc genhtml_function_coverage=1
00:04:50.425 --rc genhtml_legend=1
00:04:50.425 --rc geninfo_all_blocks=1
00:04:50.425 --rc geninfo_unexecuted_blocks=1
00:04:50.425 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:50.425 '
00:04:50.425 09:28:13 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:04:50.425 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:50.425 --rc genhtml_branch_coverage=1
00:04:50.425 --rc genhtml_function_coverage=1
00:04:50.425 --rc genhtml_legend=1
00:04:50.425 --rc geninfo_all_blocks=1
00:04:50.425 --rc geninfo_unexecuted_blocks=1
00:04:50.425 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:50.425 '
00:04:50.425 09:28:13 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:04:50.425 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:50.425 --rc genhtml_branch_coverage=1
00:04:50.425 --rc genhtml_function_coverage=1
00:04:50.425 --rc genhtml_legend=1
00:04:50.425 --rc geninfo_all_blocks=1
00:04:50.425 --rc geninfo_unexecuted_blocks=1
00:04:50.425 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:50.426 '
00:04:50.426 09:28:13 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:04:50.426 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:04:50.426 --rc genhtml_branch_coverage=1
00:04:50.426 --rc genhtml_function_coverage=1
00:04:50.426 --rc genhtml_legend=1
00:04:50.426 --rc geninfo_all_blocks=1
00:04:50.426 --rc geninfo_unexecuted_blocks=1
00:04:50.426 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:04:50.426 '
00:04:50.426 09:28:13 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd
00:04:50.426 09:28:13 -- scheduler/scheduler.sh@35 -- # scheduler_pid=3158069
00:04:50.426 09:28:13 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT
00:04:50.426 09:28:13 -- scheduler/scheduler.sh@37 -- # waitforlisten 3158069
00:04:50.426 09:28:13 -- common/autotest_common.sh@829 -- # '[' -z 3158069 ']'
00:04:50.426 09:28:13 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:04:50.426 09:28:13 -- common/autotest_common.sh@834 -- # local max_retries=100
00:04:50.426 09:28:13 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
09:28:13 -- common/autotest_common.sh@838 -- # xtrace_disable
00:04:50.426 09:28:13 -- common/autotest_common.sh@10 -- # set +x
00:04:50.426 09:28:13 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f
00:04:50.426 [2024-11-29 09:28:13.159656] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:04:50.426 [2024-11-29 09:28:13.159749] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3158069 ]
00:04:50.426 EAL: No free 2048 kB hugepages reported on node 1
00:04:50.426 [2024-11-29 09:28:13.225640] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4
00:04:50.684 [2024-11-29 09:28:13.302493] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:04:50.684 [2024-11-29 09:28:13.302578] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:04:50.684 [2024-11-29 09:28:13.302665] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:04:50.685 [2024-11-29 09:28:13.302668] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:04:51.252 09:28:14 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:04:51.252 09:28:14 -- common/autotest_common.sh@862 -- # return 0
00:04:51.252 09:28:14 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic
00:04:51.252 09:28:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:04:51.252 09:28:14 -- common/autotest_common.sh@10 -- # set +x
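Because the scheduler app is launched with --wait-for-rpc, its framework sits idle until it is driven over the RPC socket: the test selects the dynamic scheduler first and only then completes subsystem init. With a running app this corresponds roughly to (both RPC names appear in the trace; /var/tmp/spdk.sock is the default socket assumed here):

    scripts/rpc.py -s /var/tmp/spdk.sock framework_set_scheduler dynamic
    scripts/rpc.py -s /var/tmp/spdk.sock framework_start_init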
00:04:51.252 POWER: Env isn't set yet!
00:04:51.252 POWER: Attempting to initialise ACPI cpufreq power management...
00:04:51.252 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor
00:04:51.252 POWER: Cannot set governor of lcore 0 to userspace
00:04:51.252 POWER: Attempting to initialise PSTAT power management...
00:04:51.252 POWER: Power management governor of lcore 0 has been set to 'performance' successfully
00:04:51.252 POWER: Initialized successfully for lcore 0 power management
00:04:51.252 POWER: Power management governor of lcore 1 has been set to 'performance' successfully
00:04:51.252 POWER: Initialized successfully for lcore 1 power management
00:04:51.252 POWER: Power management governor of lcore 2 has been set to 'performance' successfully
00:04:51.252 POWER: Initialized successfully for lcore 2 power management
00:04:51.252 POWER: Power management governor of lcore 3 has been set to 'performance' successfully
00:04:51.253 POWER: Initialized successfully for lcore 3 power management
00:04:51.253 [2024-11-29 09:28:14.052841] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20
00:04:51.253 [2024-11-29 09:28:14.052857] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80
00:04:51.253 [2024-11-29 09:28:14.052868] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95
00:04:51.253 09:28:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:04:51.253 09:28:14 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init
00:04:51.253 09:28:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:04:51.253 09:28:14 -- common/autotest_common.sh@10 -- # set +x
00:04:51.512 [2024-11-29 09:28:14.121136] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started.
00:04:51.512 09:28:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:04:51.512 09:28:14 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread
00:04:51.512 09:28:14 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:51.512 09:28:14 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:51.512 09:28:14 -- common/autotest_common.sh@10 -- # set +x
00:04:51.512 ************************************
00:04:51.512 START TEST scheduler_create_thread
00:04:51.512 ************************************
00:04:51.512 09:28:14 -- common/autotest_common.sh@1114 -- # scheduler_create_thread
00:04:51.512 09:28:14 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
00:04:51.512 09:28:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:04:51.512 09:28:14 -- common/autotest_common.sh@10 -- # set +x
00:04:51.512 2
00:04:51.512 09:28:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:04:51.512 09:28:14 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100
00:04:51.512 09:28:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:04:51.512 09:28:14 -- common/autotest_common.sh@10 -- # set +x
00:04:51.512 3
00:04:51.512 09:28:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:04:51.512 09:28:14 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100
00:04:51.512 09:28:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:04:51.512 09:28:14 -- common/autotest_common.sh@10 -- # set +x
00:04:51.512 4
00:04:51.512 09:28:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:04:51.512 09:28:14 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100
00:04:51.512 09:28:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:04:51.512 09:28:14 -- common/autotest_common.sh@10 -- # set +x
00:04:51.512 5
00:04:51.512 09:28:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:04:51.512 09:28:14 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
00:04:51.512 09:28:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:04:51.512 09:28:14 -- common/autotest_common.sh@10 -- # set +x
00:04:51.512 6
00:04:51.512 09:28:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:04:51.512 09:28:14 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0
00:04:51.512 09:28:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:04:51.512 09:28:14 -- common/autotest_common.sh@10 -- # set +x
00:04:51.512 7
00:04:51.512 09:28:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:04:51.512 09:28:14 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0
00:04:51.512 09:28:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:04:51.512 09:28:14 -- common/autotest_common.sh@10 -- # set +x
00:04:51.512 8
00:04:51.512 09:28:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:04:51.512 09:28:14 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0
00:04:51.512 09:28:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:04:51.512 09:28:14 -- common/autotest_common.sh@10 -- # set +x
00:04:51.512 9
00:04:51.512 09:28:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:04:51.512 09:28:14 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30
00:04:51.512 09:28:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:04:51.512 09:28:14 -- common/autotest_common.sh@10 -- # set +x
00:04:51.512 10
00:04:51.512 09:28:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:04:51.512 09:28:14 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0
00:04:51.512 09:28:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:04:51.512 09:28:14 -- common/autotest_common.sh@10 -- # set +x
00:04:51.512 09:28:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:04:51.512 09:28:14 -- scheduler/scheduler.sh@22 -- # thread_id=11
00:04:51.512 09:28:14 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50
00:04:51.512 09:28:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:04:51.512 09:28:14 -- common/autotest_common.sh@10 -- # set +x
00:04:52.079 09:28:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:04:52.079 09:28:14 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100
00:04:52.079 09:28:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:04:52.079 09:28:14 -- common/autotest_common.sh@10 -- # set +x
00:04:53.455 09:28:16 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:04:53.455 09:28:16 -- scheduler/scheduler.sh@25 -- # thread_id=12
00:04:53.455 09:28:16 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12
00:04:53.455 09:28:16 -- common/autotest_common.sh@561 -- # xtrace_disable
00:04:53.455 09:28:16 -- common/autotest_common.sh@10 -- # set +x
00:04:54.391 09:28:17 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:04:54.391
00:04:54.391 real 0m3.101s
00:04:54.391 user 0m0.022s
00:04:54.391 sys 0m0.009s
00:04:54.391 09:28:17 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:54.391 09:28:17 -- common/autotest_common.sh@10 -- # set +x
00:04:54.391 ************************************
00:04:54.391 END TEST scheduler_create_thread
00:04:54.391 ************************************
00:04:54.649 09:28:17 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT
00:04:54.649 09:28:17 -- scheduler/scheduler.sh@46 -- # killprocess 3158069
00:04:54.649 09:28:17 -- common/autotest_common.sh@936 -- # '[' -z 3158069 ']'
00:04:54.649 09:28:17 -- common/autotest_common.sh@940 -- # kill -0 3158069
00:04:54.649 09:28:17 -- common/autotest_common.sh@941 -- # uname
00:04:54.649 09:28:17 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:04:54.650 09:28:17 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3158069
00:04:54.650 09:28:17 -- common/autotest_common.sh@942 -- # process_name=reactor_2
00:04:54.650 09:28:17 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']'
00:04:54.650 09:28:17 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3158069'
killing process with pid 3158069
00:04:54.650 09:28:17 -- common/autotest_common.sh@955 -- # kill 3158069
00:04:54.650 09:28:17 -- common/autotest_common.sh@960 -- # wait 3158069
00:04:54.909 [2024-11-29 09:28:17.612131] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped.
00:04:54.909 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully
00:04:54.909 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original
00:04:54.909 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully
00:04:54.909 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original
00:04:54.909 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully
00:04:54.909 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original
00:04:54.909 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully
00:04:54.909 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original
00:04:55.168
00:04:55.168 real 0m4.858s
00:04:55.168 user 0m9.439s
00:04:55.168 sys 0m0.455s
09:28:17 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:55.168 09:28:17 -- common/autotest_common.sh@10 -- # set +x
00:04:55.168 ************************************
00:04:55.168 END TEST event_scheduler
00:04:55.168 ************************************
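Everything in scheduler_create_thread above is driven through an out-of-tree RPC plugin: rpc_cmd resolves to rpc.py --plugin scheduler_plugin, as set at scheduler.sh@29. The traced calls correspond to invocations like the following (thread ids such as 11 and 12 are simply whatever the app returned at runtime):

    # Create pinned active/idle threads, a partially-active one, then a deletable one.
    scripts/rpc.py --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
    scripts/rpc.py --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
    scripts/rpc.py --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0
    scripts/rpc.py --plugin scheduler_plugin scheduler_thread_set_active 11 50
    scripts/rpc.py --plugin scheduler_plugin scheduler_thread_delete 12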
00:04:55.168 09:28:17 -- event/event.sh@51 -- # modprobe -n nbd
00:04:55.168 09:28:17 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test
09:28:17 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
09:28:17 -- common/autotest_common.sh@1093 -- # xtrace_disable
09:28:17 -- common/autotest_common.sh@10 -- # set +x
00:04:55.168 ************************************
00:04:55.168 START TEST app_repeat
00:04:55.168 ************************************
00:04:55.168 09:28:17 -- common/autotest_common.sh@1114 -- # app_repeat_test
00:04:55.168 09:28:17 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:04:55.168 09:28:17 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:04:55.168 09:28:17 -- event/event.sh@13 -- # local nbd_list
00:04:55.168 09:28:17 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1')
00:04:55.169 09:28:17 -- event/event.sh@14 -- # local bdev_list
00:04:55.169 09:28:17 -- event/event.sh@15 -- # local repeat_times=4
00:04:55.169 09:28:17 -- event/event.sh@17 -- # modprobe nbd
00:04:55.169 09:28:17 -- event/event.sh@19 -- # repeat_pid=3158933
00:04:55.169 09:28:17 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT
00:04:55.169 09:28:17 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4
00:04:55.169 09:28:17 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 3158933'
Process app_repeat pid: 3158933
00:04:55.169 09:28:17 -- event/event.sh@23 -- # for i in {0..2}
00:04:55.169 09:28:17 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0'
spdk_app_start Round 0
00:04:55.169 09:28:17 -- event/event.sh@25 -- # waitforlisten 3158933 /var/tmp/spdk-nbd.sock
00:04:55.169 09:28:17 -- common/autotest_common.sh@829 -- # '[' -z 3158933 ']'
00:04:55.169 09:28:17 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:04:55.169 09:28:17 -- common/autotest_common.sh@834 -- # local max_retries=100
00:04:55.169 09:28:17 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:04:55.169 09:28:17 -- common/autotest_common.sh@838 -- # xtrace_disable
00:04:55.169 09:28:17 -- common/autotest_common.sh@10 -- # set +x
00:04:55.169 [2024-11-29 09:28:17.886532] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:04:55.169 [2024-11-29 09:28:17.886595] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3158933 ]
00:04:55.169 EAL: No free 2048 kB hugepages reported on node 1
00:04:55.169 [2024-11-29 09:28:17.953489] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2
00:04:55.427 [2024-11-29 09:28:18.029618] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:04:55.427 [2024-11-29 09:28:18.029619] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:04:55.994 09:28:18 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:04:55.994 09:28:18 -- common/autotest_common.sh@862 -- # return 0
00:04:55.994 09:28:18 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:04:56.276 Malloc0
00:04:56.276 09:28:18 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:04:56.276 Malloc1
00:04:56.276 09:28:19 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:04:56.276 09:28:19 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:04:56.276 09:28:19 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1')
00:04:56.276 09:28:19 -- bdev/nbd_common.sh@91 -- # local bdev_list
00:04:56.276 09:28:19 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:04:56.276 09:28:19 -- bdev/nbd_common.sh@92 -- # local nbd_list
00:04:56.276 09:28:19 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:04:56.277 09:28:19 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:04:56.277 09:28:19 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1')
00:04:56.277 09:28:19 -- bdev/nbd_common.sh@10 -- # local bdev_list
00:04:56.277 09:28:19 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:04:56.277 09:28:19 -- bdev/nbd_common.sh@11 -- # local nbd_list
00:04:56.277 09:28:19 -- bdev/nbd_common.sh@12 -- # local i
00:04:56.277 09:28:19 -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:04:56.277 09:28:19 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:04:56.277 09:28:19 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
00:04:56.535 /dev/nbd0
00:04:56.535 09:28:19 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:04:56.535 09:28:19 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:04:56.535 09:28:19 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:04:56.535 09:28:19 -- common/autotest_common.sh@867 -- # local i
00:04:56.535 09:28:19 -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:04:56.535 09:28:19 -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:04:56.535 09:28:19 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:04:56.535 09:28:19 -- common/autotest_common.sh@871 -- # break
00:04:56.535 09:28:19 -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:04:56.535 09:28:19 -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:04:56.535 09:28:19 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:04:56.535 1+0 records in
00:04:56.535 1+0 records out
00:04:56.535 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000220267 s, 18.6 MB/s
00:04:56.535 09:28:19 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:04:56.536 09:28:19 -- common/autotest_common.sh@884 -- # size=4096
00:04:56.536 09:28:19 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:04:56.536 09:28:19 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:04:56.536 09:28:19 -- common/autotest_common.sh@887 -- # return 0
00:04:56.536 09:28:19 -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:04:56.536 09:28:19 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:04:56.536 09:28:19 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
00:04:56.795 /dev/nbd1
00:04:56.795 09:28:19 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:04:56.795 09:28:19 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:04:56.795 09:28:19 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:04:56.795 09:28:19 -- common/autotest_common.sh@867 -- # local i
00:04:56.795 09:28:19 -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:04:56.795 09:28:19 -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:04:56.795 09:28:19 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:04:56.795 09:28:19 -- common/autotest_common.sh@871 -- # break
00:04:56.795 09:28:19 -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:04:56.795 09:28:19 -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:04:56.795 09:28:19 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:04:56.795 1+0 records in
00:04:56.795 1+0 records out
00:04:56.795 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00026348 s, 15.5 MB/s
00:04:56.795 09:28:19 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:04:56.795 09:28:19 -- common/autotest_common.sh@884 -- # size=4096
00:04:56.795 09:28:19 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:04:56.795 09:28:19 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:04:56.795 09:28:19 -- common/autotest_common.sh@887 -- # return 0
00:04:56.795 09:28:19 -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:04:56.795 09:28:19 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:04:56.795 09:28:19 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:04:56.795 09:28:19 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:04:56.795 09:28:19 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:04:57.054 {
00:04:57.054 "nbd_device": "/dev/nbd0",
00:04:57.054 "bdev_name": "Malloc0"
00:04:57.054 },
00:04:57.054 {
00:04:57.054 "nbd_device": "/dev/nbd1",
00:04:57.054 "bdev_name": "Malloc1"
00:04:57.054 }
00:04:57.054 ]'
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@64 -- # echo '[
00:04:57.054 {
00:04:57.054 "nbd_device": "/dev/nbd0",
00:04:57.054 "bdev_name": "Malloc0"
00:04:57.054 },
00:04:57.054 {
00:04:57.054 "nbd_device": "/dev/nbd1",
00:04:57.054 "bdev_name": "Malloc1"
00:04:57.054 }
00:04:57.054 ]'
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:04:57.054 /dev/nbd1'
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:04:57.054 /dev/nbd1'
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@65 -- # count=2
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@66 -- # echo 2
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@95 -- # count=2
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@70 -- # local nbd_list
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@71 -- # local operation=write
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256
00:04:57.054 256+0 records in
00:04:57.054 256+0 records out
00:04:57.054 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106392 s, 98.6 MB/s
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:04:57.054 256+0 records in
00:04:57.054 256+0 records out
00:04:57.054 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0193732 s, 54.1 MB/s
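nbd_dd_data_verify's write pass above, paired with the verify pass that follows, is a plain write-then-compare round trip: 1 MiB of random data is staged in a file, pushed through each nbd device with O_DIRECT, and later compared back byte by byte. Stripped of the tracing, the pattern is (paths as used in this workspace):

    TMP=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
    dd if=/dev/urandom of=$TMP bs=4096 count=256             # stage 1 MiB of random data
    dd if=$TMP of=/dev/nbd0 bs=4096 count=256 oflag=direct   # write it through the nbd device
    cmp -b -n 1M $TMP /dev/nbd0                              # verify it reads back identical
    rm $TMP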
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:04:57.054 256+0 records in
00:04:57.054 256+0 records out
00:04:57.054 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0209925 s, 50.0 MB/s
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@70 -- # local nbd_list
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@71 -- # local operation=verify
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@50 -- # local nbd_list
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@51 -- # local i
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:04:57.054 09:28:19 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:04:57.313 09:28:20 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:04:57.313 09:28:20 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:04:57.313 09:28:20 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:04:57.313 09:28:20 -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:04:57.313 09:28:20 -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:04:57.313 09:28:20 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:04:57.313 09:28:20 -- bdev/nbd_common.sh@41 -- # break
00:04:57.313 09:28:20 -- bdev/nbd_common.sh@45 -- # return 0
00:04:57.313 09:28:20 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:04:57.313 09:28:20 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:04:57.571 09:28:20 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:04:57.571 09:28:20 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:04:57.571 09:28:20 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:04:57.571 09:28:20 -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:04:57.571 09:28:20 -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:04:57.571 09:28:20 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:04:57.571 09:28:20 -- bdev/nbd_common.sh@41 -- # break
00:04:57.571 09:28:20 -- bdev/nbd_common.sh@45 -- # return 0
00:04:57.571 09:28:20 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:04:57.571 09:28:20 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:04:57.571 09:28:20 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:04:57.828 09:28:20 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:04:57.828 09:28:20 -- bdev/nbd_common.sh@64 -- # echo '[]'
00:04:57.828 09:28:20 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:04:57.828 09:28:20 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:04:57.828 09:28:20 -- bdev/nbd_common.sh@65 -- # echo ''
00:04:57.828 09:28:20 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:04:57.828 09:28:20 -- bdev/nbd_common.sh@65 -- # true
00:04:57.828 09:28:20 -- bdev/nbd_common.sh@65 -- # count=0
00:04:57.828 09:28:20 -- bdev/nbd_common.sh@66 -- # echo 0
00:04:57.828 09:28:20 -- bdev/nbd_common.sh@104 -- # count=0
00:04:57.828 09:28:20 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:04:57.828 09:28:20 -- bdev/nbd_common.sh@109 -- # return 0
00:04:57.828 09:28:20 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
00:04:57.828 09:28:20 -- event/event.sh@35 -- # sleep 3
00:04:58.086 [2024-11-29 09:28:20.854237] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2
00:04:58.086 [2024-11-29 09:28:20.917645] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:04:58.086 [2024-11-29 09:28:20.917647] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:04:58.344 [2024-11-29 09:28:20.958803] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered.
00:04:58.344 [2024-11-29 09:28:20.958849] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered.
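Each app_repeat round is torn down by asking the target to stop itself over RPC rather than signalling it externally, exactly as traced at event.sh@34:

    scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM

The sleep 3 that follows covers the restart; the fresh "Total cores available" and notify registration notices above are the app coming back up for the next round.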
00:05:00.874 09:28:23 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:00.874 09:28:23 -- common/autotest_common.sh@10 -- # set +x 00:05:01.133 09:28:23 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:01.133 09:28:23 -- common/autotest_common.sh@862 -- # return 0 00:05:01.133 09:28:23 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:01.442 Malloc0 00:05:01.442 09:28:24 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:01.442 Malloc1 00:05:01.442 09:28:24 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:01.442 09:28:24 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:01.442 09:28:24 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:01.442 09:28:24 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:01.442 09:28:24 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:01.442 09:28:24 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:01.442 09:28:24 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:01.442 09:28:24 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:01.442 09:28:24 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:01.442 09:28:24 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:01.442 09:28:24 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:01.442 09:28:24 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:01.442 09:28:24 -- bdev/nbd_common.sh@12 -- # local i 00:05:01.442 09:28:24 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:01.442 09:28:24 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:01.442 09:28:24 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:01.740 /dev/nbd0 00:05:01.740 09:28:24 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:01.740 09:28:24 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:01.740 09:28:24 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:05:01.740 09:28:24 -- common/autotest_common.sh@867 -- # local i 00:05:01.740 09:28:24 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:01.740 09:28:24 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:01.740 09:28:24 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:05:01.740 09:28:24 -- common/autotest_common.sh@871 -- # break 00:05:01.740 09:28:24 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:01.740 09:28:24 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:01.740 09:28:24 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:01.740 1+0 records in 00:05:01.740 1+0 records out 00:05:01.740 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000162194 s, 25.3 MB/s 00:05:01.740 09:28:24 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:01.740 09:28:24 -- common/autotest_common.sh@884 -- # size=4096 00:05:01.740 09:28:24 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:01.740 09:28:24 -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:01.740 09:28:24 -- common/autotest_common.sh@887 -- # return 0 00:05:01.740 09:28:24 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:01.740 09:28:24 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:01.740 09:28:24 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:02.025 /dev/nbd1 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:02.025 09:28:24 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:05:02.025 09:28:24 -- common/autotest_common.sh@867 -- # local i 00:05:02.025 09:28:24 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:02.025 09:28:24 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:02.025 09:28:24 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:05:02.025 09:28:24 -- common/autotest_common.sh@871 -- # break 00:05:02.025 09:28:24 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:02.025 09:28:24 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:02.025 09:28:24 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:02.025 1+0 records in 00:05:02.025 1+0 records out 00:05:02.025 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245504 s, 16.7 MB/s 00:05:02.025 09:28:24 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:02.025 09:28:24 -- common/autotest_common.sh@884 -- # size=4096 00:05:02.025 09:28:24 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:02.025 09:28:24 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:02.025 09:28:24 -- common/autotest_common.sh@887 -- # return 0 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:02.025 { 00:05:02.025 "nbd_device": "/dev/nbd0", 00:05:02.025 "bdev_name": "Malloc0" 00:05:02.025 }, 00:05:02.025 { 00:05:02.025 "nbd_device": "/dev/nbd1", 00:05:02.025 "bdev_name": "Malloc1" 00:05:02.025 } 00:05:02.025 ]' 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:02.025 { 00:05:02.025 "nbd_device": "/dev/nbd0", 00:05:02.025 "bdev_name": "Malloc0" 00:05:02.025 }, 00:05:02.025 { 00:05:02.025 "nbd_device": "/dev/nbd1", 00:05:02.025 "bdev_name": "Malloc1" 00:05:02.025 } 00:05:02.025 ]' 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:02.025 /dev/nbd1' 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:02.025 /dev/nbd1' 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@65 -- # count=2 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:02.025 09:28:24 -- 
bdev/nbd_common.sh@95 -- # count=2 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:02.025 09:28:24 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:02.284 256+0 records in 00:05:02.284 256+0 records out 00:05:02.284 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109117 s, 96.1 MB/s 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:02.284 256+0 records in 00:05:02.284 256+0 records out 00:05:02.284 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0197504 s, 53.1 MB/s 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:02.284 256+0 records in 00:05:02.284 256+0 records out 00:05:02.284 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0209263 s, 50.1 MB/s 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@51 -- # local i 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:02.284 09:28:24 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@41 -- # break 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@45 -- # return 0 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@41 -- # break 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@45 -- # return 0 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:02.542 09:28:25 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:02.800 09:28:25 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:02.800 09:28:25 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:02.800 09:28:25 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:02.800 09:28:25 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:02.800 09:28:25 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:02.800 09:28:25 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:02.800 09:28:25 -- bdev/nbd_common.sh@65 -- # true 00:05:02.800 09:28:25 -- bdev/nbd_common.sh@65 -- # count=0 00:05:02.800 09:28:25 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:02.800 09:28:25 -- bdev/nbd_common.sh@104 -- # count=0 00:05:02.800 09:28:25 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:02.800 09:28:25 -- bdev/nbd_common.sh@109 -- # return 0 00:05:02.800 09:28:25 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:03.059 09:28:25 -- event/event.sh@35 -- # sleep 3 00:05:03.317 [2024-11-29 09:28:25.951003] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:03.317 [2024-11-29 09:28:26.015953] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:03.317 [2024-11-29 09:28:26.015955] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:03.317 [2024-11-29 09:28:26.055964] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:03.317 [2024-11-29 09:28:26.056009] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
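The nbd round trip traced above is the core of this test: nbd_dd_data_verify writes a single 1 MiB random payload (256 x 4 KiB blocks) onto each /dev/nbdX with direct I/O, byte-compares every device against the source file with cmp -b -n 1M, then stops the disks and confirms nbd_get_disks reports zero devices. A minimal sketch of the write/verify phases, with the workspace paths shortened and the device list hard-coded for brevity:

    # nbd_dd_data_verify-style round trip (sketch; the temp path is illustrative).
    nbd_data_verify_sketch() {
        local tmp_file=/tmp/nbdrandtest nbd

        # Write phase: one random payload, copied onto every device with O_DIRECT.
        dd if=/dev/urandom of="$tmp_file" bs=4096 count=256 || return 1
        for nbd in /dev/nbd0 /dev/nbd1; do
            dd if="$tmp_file" of="$nbd" bs=4096 count=256 oflag=direct || return 1
        done

        # Verify phase: the first 1M of each device must match the payload byte-for-byte.
        for nbd in /dev/nbd0 /dev/nbd1; do
            cmp -b -n 1M "$tmp_file" "$nbd" || return 1
        done
        rm "$tmp_file"
    }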
00:05:06.603 09:28:28 -- event/event.sh@23 -- # for i in {0..2} 00:05:06.603 09:28:28 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:05:06.603 spdk_app_start Round 2 00:05:06.603 09:28:28 -- event/event.sh@25 -- # waitforlisten 3158933 /var/tmp/spdk-nbd.sock 00:05:06.603 09:28:28 -- common/autotest_common.sh@829 -- # '[' -z 3158933 ']' 00:05:06.603 09:28:28 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:06.603 09:28:28 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:06.603 09:28:28 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:06.603 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:06.603 09:28:28 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:06.603 09:28:28 -- common/autotest_common.sh@10 -- # set +x 00:05:06.603 09:28:28 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:06.603 09:28:28 -- common/autotest_common.sh@862 -- # return 0 00:05:06.603 09:28:28 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:06.603 Malloc0 00:05:06.603 09:28:29 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:06.603 Malloc1 00:05:06.603 09:28:29 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:06.603 09:28:29 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:06.603 09:28:29 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:06.603 09:28:29 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:06.603 09:28:29 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:06.603 09:28:29 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:06.603 09:28:29 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:06.603 09:28:29 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:06.603 09:28:29 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:06.603 09:28:29 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:06.603 09:28:29 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:06.603 09:28:29 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:06.603 09:28:29 -- bdev/nbd_common.sh@12 -- # local i 00:05:06.603 09:28:29 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:06.603 09:28:29 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:06.603 09:28:29 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:06.862 /dev/nbd0 00:05:06.862 09:28:29 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:06.862 09:28:29 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:06.862 09:28:29 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:05:06.862 09:28:29 -- common/autotest_common.sh@867 -- # local i 00:05:06.862 09:28:29 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:06.862 09:28:29 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:06.862 09:28:29 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:05:06.862 09:28:29 -- common/autotest_common.sh@871 -- # break 00:05:06.862 09:28:29 -- common/autotest_common.sh@882 -- # (( 
i = 1 )) 00:05:06.862 09:28:29 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:06.862 09:28:29 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:06.862 1+0 records in 00:05:06.862 1+0 records out 00:05:06.862 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024669 s, 16.6 MB/s 00:05:06.862 09:28:29 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:06.862 09:28:29 -- common/autotest_common.sh@884 -- # size=4096 00:05:06.862 09:28:29 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:06.862 09:28:29 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:06.862 09:28:29 -- common/autotest_common.sh@887 -- # return 0 00:05:06.862 09:28:29 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:06.862 09:28:29 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:06.862 09:28:29 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:07.121 /dev/nbd1 00:05:07.121 09:28:29 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:07.121 09:28:29 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:07.121 09:28:29 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:05:07.121 09:28:29 -- common/autotest_common.sh@867 -- # local i 00:05:07.122 09:28:29 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:07.122 09:28:29 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:07.122 09:28:29 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:05:07.122 09:28:29 -- common/autotest_common.sh@871 -- # break 00:05:07.122 09:28:29 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:07.122 09:28:29 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:07.122 09:28:29 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:07.122 1+0 records in 00:05:07.122 1+0 records out 00:05:07.122 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241266 s, 17.0 MB/s 00:05:07.122 09:28:29 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:07.122 09:28:29 -- common/autotest_common.sh@884 -- # size=4096 00:05:07.122 09:28:29 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:07.122 09:28:29 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:07.122 09:28:29 -- common/autotest_common.sh@887 -- # return 0 00:05:07.122 09:28:29 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:07.122 09:28:29 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:07.122 09:28:29 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:07.122 09:28:29 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:07.122 09:28:29 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:07.122 09:28:29 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:07.122 { 00:05:07.122 "nbd_device": "/dev/nbd0", 00:05:07.122 "bdev_name": "Malloc0" 00:05:07.122 }, 00:05:07.122 { 00:05:07.122 "nbd_device": "/dev/nbd1", 00:05:07.122 "bdev_name": "Malloc1" 00:05:07.122 } 00:05:07.122 ]' 00:05:07.122 09:28:29 -- 
bdev/nbd_common.sh@64 -- # echo '[ 00:05:07.122 { 00:05:07.122 "nbd_device": "/dev/nbd0", 00:05:07.122 "bdev_name": "Malloc0" 00:05:07.122 }, 00:05:07.122 { 00:05:07.122 "nbd_device": "/dev/nbd1", 00:05:07.122 "bdev_name": "Malloc1" 00:05:07.122 } 00:05:07.122 ]' 00:05:07.122 09:28:29 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:07.122 09:28:29 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:07.122 /dev/nbd1' 00:05:07.381 09:28:29 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:07.381 /dev/nbd1' 00:05:07.381 09:28:29 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:07.381 09:28:29 -- bdev/nbd_common.sh@65 -- # count=2 00:05:07.381 09:28:29 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:07.381 09:28:29 -- bdev/nbd_common.sh@95 -- # count=2 00:05:07.381 09:28:29 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:07.381 09:28:29 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:07.381 09:28:29 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:07.381 09:28:29 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:07.381 09:28:29 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:07.381 09:28:29 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:07.381 09:28:29 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:07.381 09:28:29 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:07.381 256+0 records in 00:05:07.381 256+0 records out 00:05:07.381 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0105772 s, 99.1 MB/s 00:05:07.381 09:28:29 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:07.381 09:28:29 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:07.381 256+0 records in 00:05:07.381 256+0 records out 00:05:07.381 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0197072 s, 53.2 MB/s 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:07.381 256+0 records in 00:05:07.381 256+0 records out 00:05:07.381 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0219392 s, 47.8 MB/s 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 
/dev/nbd1 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@51 -- # local i 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:07.381 09:28:30 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@41 -- # break 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@45 -- # return 0 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@41 -- # break 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@45 -- # return 0 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:07.640 09:28:30 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:07.899 09:28:30 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:07.899 09:28:30 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:07.899 09:28:30 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:07.899 09:28:30 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:07.899 09:28:30 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:07.899 09:28:30 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:07.899 09:28:30 -- bdev/nbd_common.sh@65 -- # true 00:05:07.899 09:28:30 -- bdev/nbd_common.sh@65 -- # count=0 00:05:07.899 09:28:30 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:07.899 09:28:30 -- bdev/nbd_common.sh@104 -- # count=0 00:05:07.899 09:28:30 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:07.899 09:28:30 -- bdev/nbd_common.sh@109 -- # return 0 00:05:07.899 09:28:30 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:08.158 09:28:30 -- event/event.sh@35 -- # sleep 3 00:05:08.415 [2024-11-29 09:28:31.057858] app.c: 
798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:08.415 [2024-11-29 09:28:31.121484] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:08.415 [2024-11-29 09:28:31.121487] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:08.416 [2024-11-29 09:28:31.162293] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:08.416 [2024-11-29 09:28:31.162338] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:11.700 09:28:33 -- event/event.sh@38 -- # waitforlisten 3158933 /var/tmp/spdk-nbd.sock 00:05:11.700 09:28:33 -- common/autotest_common.sh@829 -- # '[' -z 3158933 ']' 00:05:11.700 09:28:33 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:11.700 09:28:33 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:11.700 09:28:33 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:11.700 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:11.700 09:28:33 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:11.700 09:28:33 -- common/autotest_common.sh@10 -- # set +x 00:05:11.700 09:28:34 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:11.700 09:28:34 -- common/autotest_common.sh@862 -- # return 0 00:05:11.700 09:28:34 -- event/event.sh@39 -- # killprocess 3158933 00:05:11.700 09:28:34 -- common/autotest_common.sh@936 -- # '[' -z 3158933 ']' 00:05:11.700 09:28:34 -- common/autotest_common.sh@940 -- # kill -0 3158933 00:05:11.700 09:28:34 -- common/autotest_common.sh@941 -- # uname 00:05:11.700 09:28:34 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:11.700 09:28:34 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3158933 00:05:11.700 09:28:34 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:11.700 09:28:34 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:11.700 09:28:34 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3158933' 00:05:11.700 killing process with pid 3158933 00:05:11.700 09:28:34 -- common/autotest_common.sh@955 -- # kill 3158933 00:05:11.700 09:28:34 -- common/autotest_common.sh@960 -- # wait 3158933 00:05:11.700 spdk_app_start is called in Round 0. 00:05:11.700 Shutdown signal received, stop current app iteration 00:05:11.700 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:05:11.700 spdk_app_start is called in Round 1. 00:05:11.700 Shutdown signal received, stop current app iteration 00:05:11.700 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:05:11.700 spdk_app_start is called in Round 2. 00:05:11.700 Shutdown signal received, stop current app iteration 00:05:11.700 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:05:11.700 spdk_app_start is called in Round 3. 
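killprocess, seen above tearing down pid 3158933, is more careful than a bare kill: it first verifies the pid is still alive with kill -0, resolves the command name with ps --no-headers -o comm= so it never signals a sudo wrapper by mistake, then kills and reaps the process. A hedged reconstruction from the visible trace (the non-Linux branch is omitted here):

    # killprocess-style helper; mirrors the checks visible in the trace above.
    killprocess_sketch() {
        local pid=$1 process_name
        [ -z "$pid" ] && return 1
        kill -0 "$pid" 2>/dev/null || return 1        # already gone?
        if [ "$(uname)" = Linux ]; then
            process_name=$(ps --no-headers -o comm= "$pid")
            [ "$process_name" = sudo ] && return 1    # refuse to kill the sudo wrapper
        fi
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid"    # reap it so the next test starts from a clean slate
    }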
00:05:11.700 Shutdown signal received, stop current app iteration 00:05:11.700 09:28:34 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:05:11.700 09:28:34 -- event/event.sh@42 -- # return 0 00:05:11.700 00:05:11.700 real 0m16.425s 00:05:11.700 user 0m35.079s 00:05:11.700 sys 0m2.991s 00:05:11.700 09:28:34 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:11.700 09:28:34 -- common/autotest_common.sh@10 -- # set +x 00:05:11.700 ************************************ 00:05:11.700 END TEST app_repeat 00:05:11.700 ************************************ 00:05:11.700 09:28:34 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:05:11.700 09:28:34 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:11.700 09:28:34 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:11.700 09:28:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:11.700 09:28:34 -- common/autotest_common.sh@10 -- # set +x 00:05:11.700 ************************************ 00:05:11.700 START TEST cpu_locks 00:05:11.700 ************************************ 00:05:11.700 09:28:34 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:05:11.700 * Looking for test storage... 00:05:11.700 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:05:11.700 09:28:34 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:11.700 09:28:34 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:11.700 09:28:34 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:11.700 09:28:34 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:11.700 09:28:34 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:11.700 09:28:34 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:11.700 09:28:34 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:11.700 09:28:34 -- scripts/common.sh@335 -- # IFS=.-: 00:05:11.700 09:28:34 -- scripts/common.sh@335 -- # read -ra ver1 00:05:11.700 09:28:34 -- scripts/common.sh@336 -- # IFS=.-: 00:05:11.700 09:28:34 -- scripts/common.sh@336 -- # read -ra ver2 00:05:11.700 09:28:34 -- scripts/common.sh@337 -- # local 'op=<' 00:05:11.700 09:28:34 -- scripts/common.sh@339 -- # ver1_l=2 00:05:11.700 09:28:34 -- scripts/common.sh@340 -- # ver2_l=1 00:05:11.700 09:28:34 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:11.700 09:28:34 -- scripts/common.sh@343 -- # case "$op" in 00:05:11.700 09:28:34 -- scripts/common.sh@344 -- # : 1 00:05:11.700 09:28:34 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:11.700 09:28:34 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:11.700 09:28:34 -- scripts/common.sh@364 -- # decimal 1 00:05:11.700 09:28:34 -- scripts/common.sh@352 -- # local d=1 00:05:11.700 09:28:34 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:11.700 09:28:34 -- scripts/common.sh@354 -- # echo 1 00:05:11.700 09:28:34 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:11.700 09:28:34 -- scripts/common.sh@365 -- # decimal 2 00:05:11.700 09:28:34 -- scripts/common.sh@352 -- # local d=2 00:05:11.700 09:28:34 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:11.700 09:28:34 -- scripts/common.sh@354 -- # echo 2 00:05:11.700 09:28:34 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:11.700 09:28:34 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:11.700 09:28:34 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:11.700 09:28:34 -- scripts/common.sh@367 -- # return 0 00:05:11.700 09:28:34 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:11.700 09:28:34 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:11.700 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:11.700 --rc genhtml_branch_coverage=1 00:05:11.700 --rc genhtml_function_coverage=1 00:05:11.700 --rc genhtml_legend=1 00:05:11.700 --rc geninfo_all_blocks=1 00:05:11.700 --rc geninfo_unexecuted_blocks=1 00:05:11.700 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:11.700 ' 00:05:11.700 09:28:34 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:11.700 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:11.700 --rc genhtml_branch_coverage=1 00:05:11.700 --rc genhtml_function_coverage=1 00:05:11.700 --rc genhtml_legend=1 00:05:11.700 --rc geninfo_all_blocks=1 00:05:11.700 --rc geninfo_unexecuted_blocks=1 00:05:11.700 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:11.700 ' 00:05:11.700 09:28:34 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:11.700 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:11.700 --rc genhtml_branch_coverage=1 00:05:11.700 --rc genhtml_function_coverage=1 00:05:11.700 --rc genhtml_legend=1 00:05:11.700 --rc geninfo_all_blocks=1 00:05:11.700 --rc geninfo_unexecuted_blocks=1 00:05:11.700 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:11.700 ' 00:05:11.700 09:28:34 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:11.700 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:11.700 --rc genhtml_branch_coverage=1 00:05:11.700 --rc genhtml_function_coverage=1 00:05:11.700 --rc genhtml_legend=1 00:05:11.700 --rc geninfo_all_blocks=1 00:05:11.700 --rc geninfo_unexecuted_blocks=1 00:05:11.700 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:11.700 ' 00:05:11.700 09:28:34 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:05:11.700 09:28:34 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:05:11.700 09:28:34 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:05:11.700 09:28:34 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:05:11.700 09:28:34 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:11.700 09:28:34 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:11.700 09:28:34 -- common/autotest_common.sh@10 -- # set +x 00:05:11.700 ************************************ 00:05:11.700 START TEST default_locks 
00:05:11.700 ************************************ 00:05:11.700 09:28:34 -- common/autotest_common.sh@1114 -- # default_locks 00:05:11.700 09:28:34 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=3162142 00:05:11.700 09:28:34 -- event/cpu_locks.sh@47 -- # waitforlisten 3162142 00:05:11.700 09:28:34 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:11.700 09:28:34 -- common/autotest_common.sh@829 -- # '[' -z 3162142 ']' 00:05:11.700 09:28:34 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:11.700 09:28:34 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:11.700 09:28:34 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:11.700 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:11.700 09:28:34 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:11.959 09:28:34 -- common/autotest_common.sh@10 -- # set +x 00:05:11.959 [2024-11-29 09:28:34.564848] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:11.959 [2024-11-29 09:28:34.564938] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3162142 ] 00:05:11.959 EAL: No free 2048 kB hugepages reported on node 1 00:05:11.959 [2024-11-29 09:28:34.633459] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:11.959 [2024-11-29 09:28:34.708071] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:11.960 [2024-11-29 09:28:34.708184] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:12.894 09:28:35 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:12.894 09:28:35 -- common/autotest_common.sh@862 -- # return 0 00:05:12.894 09:28:35 -- event/cpu_locks.sh@49 -- # locks_exist 3162142 00:05:12.894 09:28:35 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:12.894 09:28:35 -- event/cpu_locks.sh@22 -- # lslocks -p 3162142 00:05:12.894 lslocks: write error 00:05:12.894 09:28:35 -- event/cpu_locks.sh@50 -- # killprocess 3162142 00:05:12.894 09:28:35 -- common/autotest_common.sh@936 -- # '[' -z 3162142 ']' 00:05:12.894 09:28:35 -- common/autotest_common.sh@940 -- # kill -0 3162142 00:05:12.895 09:28:35 -- common/autotest_common.sh@941 -- # uname 00:05:12.895 09:28:35 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:12.895 09:28:35 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3162142 00:05:12.895 09:28:35 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:12.895 09:28:35 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:12.895 09:28:35 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3162142' 00:05:12.895 killing process with pid 3162142 00:05:12.895 09:28:35 -- common/autotest_common.sh@955 -- # kill 3162142 00:05:12.895 09:28:35 -- common/autotest_common.sh@960 -- # wait 3162142 00:05:13.463 09:28:36 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 3162142 00:05:13.463 09:28:36 -- common/autotest_common.sh@650 -- # local es=0 00:05:13.463 09:28:36 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 3162142 00:05:13.463 09:28:36 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:05:13.463 09:28:36 -- 
common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:13.463 09:28:36 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:05:13.463 09:28:36 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:13.463 09:28:36 -- common/autotest_common.sh@653 -- # waitforlisten 3162142 00:05:13.463 09:28:36 -- common/autotest_common.sh@829 -- # '[' -z 3162142 ']' 00:05:13.463 09:28:36 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:13.463 09:28:36 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:13.463 09:28:36 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:13.463 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:13.463 09:28:36 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:13.463 09:28:36 -- common/autotest_common.sh@10 -- # set +x 00:05:13.463 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (3162142) - No such process 00:05:13.463 ERROR: process (pid: 3162142) is no longer running 00:05:13.463 09:28:36 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:13.463 09:28:36 -- common/autotest_common.sh@862 -- # return 1 00:05:13.463 09:28:36 -- common/autotest_common.sh@653 -- # es=1 00:05:13.463 09:28:36 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:13.463 09:28:36 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:13.463 09:28:36 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:13.463 09:28:36 -- event/cpu_locks.sh@54 -- # no_locks 00:05:13.463 09:28:36 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:13.463 09:28:36 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:13.463 09:28:36 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:13.463 00:05:13.463 real 0m1.479s 00:05:13.463 user 0m1.568s 00:05:13.463 sys 0m0.510s 00:05:13.463 09:28:36 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:13.463 09:28:36 -- common/autotest_common.sh@10 -- # set +x 00:05:13.463 ************************************ 00:05:13.463 END TEST default_locks 00:05:13.463 ************************************ 00:05:13.463 09:28:36 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:05:13.463 09:28:36 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:13.463 09:28:36 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:13.463 09:28:36 -- common/autotest_common.sh@10 -- # set +x 00:05:13.463 ************************************ 00:05:13.463 START TEST default_locks_via_rpc 00:05:13.463 ************************************ 00:05:13.463 09:28:36 -- common/autotest_common.sh@1114 -- # default_locks_via_rpc 00:05:13.463 09:28:36 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=3162438 00:05:13.463 09:28:36 -- event/cpu_locks.sh@63 -- # waitforlisten 3162438 00:05:13.463 09:28:36 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:13.463 09:28:36 -- common/autotest_common.sh@829 -- # '[' -z 3162438 ']' 00:05:13.463 09:28:36 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:13.463 09:28:36 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:13.463 09:28:36 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:05:13.463 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:13.463 09:28:36 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:13.463 09:28:36 -- common/autotest_common.sh@10 -- # set +x 00:05:13.463 [2024-11-29 09:28:36.090005] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:13.463 [2024-11-29 09:28:36.090090] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3162438 ] 00:05:13.463 EAL: No free 2048 kB hugepages reported on node 1 00:05:13.463 [2024-11-29 09:28:36.158920] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:13.463 [2024-11-29 09:28:36.233317] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:13.463 [2024-11-29 09:28:36.233429] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:14.400 09:28:36 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:14.400 09:28:36 -- common/autotest_common.sh@862 -- # return 0 00:05:14.400 09:28:36 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:05:14.400 09:28:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:14.400 09:28:36 -- common/autotest_common.sh@10 -- # set +x 00:05:14.400 09:28:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:14.400 09:28:36 -- event/cpu_locks.sh@67 -- # no_locks 00:05:14.400 09:28:36 -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:14.400 09:28:36 -- event/cpu_locks.sh@26 -- # local lock_files 00:05:14.400 09:28:36 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:14.400 09:28:36 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:05:14.400 09:28:36 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:14.400 09:28:36 -- common/autotest_common.sh@10 -- # set +x 00:05:14.400 09:28:36 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:14.400 09:28:36 -- event/cpu_locks.sh@71 -- # locks_exist 3162438 00:05:14.400 09:28:36 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:14.400 09:28:36 -- event/cpu_locks.sh@22 -- # lslocks -p 3162438 00:05:14.659 09:28:37 -- event/cpu_locks.sh@73 -- # killprocess 3162438 00:05:14.659 09:28:37 -- common/autotest_common.sh@936 -- # '[' -z 3162438 ']' 00:05:14.659 09:28:37 -- common/autotest_common.sh@940 -- # kill -0 3162438 00:05:14.659 09:28:37 -- common/autotest_common.sh@941 -- # uname 00:05:14.659 09:28:37 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:14.659 09:28:37 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3162438 00:05:14.659 09:28:37 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:14.659 09:28:37 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:14.659 09:28:37 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3162438' 00:05:14.659 killing process with pid 3162438 00:05:14.659 09:28:37 -- common/autotest_common.sh@955 -- # kill 3162438 00:05:14.659 09:28:37 -- common/autotest_common.sh@960 -- # wait 3162438 00:05:14.919 00:05:14.919 real 0m1.610s 00:05:14.919 user 0m1.692s 00:05:14.919 sys 0m0.560s 00:05:14.919 09:28:37 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:14.919 09:28:37 -- common/autotest_common.sh@10 -- # set +x 00:05:14.919 ************************************ 00:05:14.919 END TEST default_locks_via_rpc 00:05:14.919 
************************************ 00:05:14.919 09:28:37 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:05:14.919 09:28:37 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:14.919 09:28:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:14.919 09:28:37 -- common/autotest_common.sh@10 -- # set +x 00:05:14.919 ************************************ 00:05:14.919 START TEST non_locking_app_on_locked_coremask 00:05:14.919 ************************************ 00:05:14.919 09:28:37 -- common/autotest_common.sh@1114 -- # non_locking_app_on_locked_coremask 00:05:14.919 09:28:37 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=3162741 00:05:14.919 09:28:37 -- event/cpu_locks.sh@81 -- # waitforlisten 3162741 /var/tmp/spdk.sock 00:05:14.919 09:28:37 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:14.919 09:28:37 -- common/autotest_common.sh@829 -- # '[' -z 3162741 ']' 00:05:14.919 09:28:37 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:14.919 09:28:37 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:14.919 09:28:37 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:14.919 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:14.919 09:28:37 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:14.919 09:28:37 -- common/autotest_common.sh@10 -- # set +x 00:05:14.919 [2024-11-29 09:28:37.747616] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:14.919 [2024-11-29 09:28:37.747702] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3162741 ] 00:05:15.179 EAL: No free 2048 kB hugepages reported on node 1 00:05:15.179 [2024-11-29 09:28:37.815556] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:15.179 [2024-11-29 09:28:37.890327] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:15.179 [2024-11-29 09:28:37.890442] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:15.747 09:28:38 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:15.747 09:28:38 -- common/autotest_common.sh@862 -- # return 0 00:05:15.747 09:28:38 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=3162895 00:05:15.747 09:28:38 -- event/cpu_locks.sh@85 -- # waitforlisten 3162895 /var/tmp/spdk2.sock 00:05:15.747 09:28:38 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:05:15.747 09:28:38 -- common/autotest_common.sh@829 -- # '[' -z 3162895 ']' 00:05:15.747 09:28:38 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:15.747 09:28:38 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:15.747 09:28:38 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:15.747 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
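Every one of these tests blocks in waitforlisten after forking spdk_tgt; only the preamble (rpc_addr, max_retries=100, the 'Waiting for process...' echo) shows up in the log because the retry loop runs between the xtrace_disable / set +x pair that brackets it. One plausible reconstruction, assuming the hidden probe simply retries an RPC over the UNIX socket until the target answers; the exact probe and sleep interval used upstream may differ:

    # waitforlisten-style sketch. Using rpc_get_methods as the liveness probe and
    # the 0.5 s pause are assumptions; the real loop body is hidden by xtrace_disable.
    # scripts/rpc.py is resolved relative to the SPDK tree here.
    waitforlisten_sketch() {
        local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock} max_retries=100 i
        echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
        for ((i = 0; i < max_retries; i++)); do
            kill -0 "$pid" 2>/dev/null || return 1    # target died before listening
            if [ -S "$rpc_addr" ] &&
                scripts/rpc.py -s "$rpc_addr" rpc_get_methods &> /dev/null; then
                return 0
            fi
            sleep 0.5
        done
        return 1
    }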
00:05:15.747 09:28:38 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:15.747 09:28:38 -- common/autotest_common.sh@10 -- # set +x 00:05:16.006 [2024-11-29 09:28:38.599250] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:16.006 [2024-11-29 09:28:38.599334] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3162895 ] 00:05:16.006 EAL: No free 2048 kB hugepages reported on node 1 00:05:16.006 [2024-11-29 09:28:38.691971] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:16.006 [2024-11-29 09:28:38.691999] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:16.006 [2024-11-29 09:28:38.837029] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:16.006 [2024-11-29 09:28:38.837157] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:16.574 09:28:39 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:16.574 09:28:39 -- common/autotest_common.sh@862 -- # return 0 00:05:16.574 09:28:39 -- event/cpu_locks.sh@87 -- # locks_exist 3162741 00:05:16.574 09:28:39 -- event/cpu_locks.sh@22 -- # lslocks -p 3162741 00:05:16.574 09:28:39 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:17.510 lslocks: write error 00:05:17.510 09:28:40 -- event/cpu_locks.sh@89 -- # killprocess 3162741 00:05:17.510 09:28:40 -- common/autotest_common.sh@936 -- # '[' -z 3162741 ']' 00:05:17.510 09:28:40 -- common/autotest_common.sh@940 -- # kill -0 3162741 00:05:17.510 09:28:40 -- common/autotest_common.sh@941 -- # uname 00:05:17.510 09:28:40 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:17.510 09:28:40 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3162741 00:05:17.510 09:28:40 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:17.510 09:28:40 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:17.510 09:28:40 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3162741' 00:05:17.510 killing process with pid 3162741 00:05:17.510 09:28:40 -- common/autotest_common.sh@955 -- # kill 3162741 00:05:17.510 09:28:40 -- common/autotest_common.sh@960 -- # wait 3162741 00:05:18.447 09:28:40 -- event/cpu_locks.sh@90 -- # killprocess 3162895 00:05:18.447 09:28:40 -- common/autotest_common.sh@936 -- # '[' -z 3162895 ']' 00:05:18.447 09:28:40 -- common/autotest_common.sh@940 -- # kill -0 3162895 00:05:18.447 09:28:40 -- common/autotest_common.sh@941 -- # uname 00:05:18.447 09:28:40 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:18.447 09:28:40 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3162895 00:05:18.447 09:28:41 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:18.447 09:28:41 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:18.447 09:28:41 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3162895' 00:05:18.447 killing process with pid 3162895 00:05:18.447 09:28:41 -- common/autotest_common.sh@955 -- # kill 3162895 00:05:18.447 09:28:41 -- common/autotest_common.sh@960 -- # wait 3162895 00:05:18.706 00:05:18.706 real 0m3.578s 00:05:18.706 user 0m3.829s 00:05:18.706 sys 0m1.160s 00:05:18.706 09:28:41 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:18.706 09:28:41 -- common/autotest_common.sh@10 -- # set +x 00:05:18.706 
************************************ 00:05:18.706 END TEST non_locking_app_on_locked_coremask 00:05:18.706 ************************************ 00:05:18.706 09:28:41 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:05:18.706 09:28:41 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:18.706 09:28:41 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:18.706 09:28:41 -- common/autotest_common.sh@10 -- # set +x 00:05:18.706 ************************************ 00:05:18.706 START TEST locking_app_on_unlocked_coremask 00:05:18.706 ************************************ 00:05:18.706 09:28:41 -- common/autotest_common.sh@1114 -- # locking_app_on_unlocked_coremask 00:05:18.706 09:28:41 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=3163330 00:05:18.706 09:28:41 -- event/cpu_locks.sh@99 -- # waitforlisten 3163330 /var/tmp/spdk.sock 00:05:18.706 09:28:41 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:05:18.706 09:28:41 -- common/autotest_common.sh@829 -- # '[' -z 3163330 ']' 00:05:18.706 09:28:41 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:18.706 09:28:41 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:18.706 09:28:41 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:18.706 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:18.706 09:28:41 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:18.706 09:28:41 -- common/autotest_common.sh@10 -- # set +x 00:05:18.706 [2024-11-29 09:28:41.374287] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:18.706 [2024-11-29 09:28:41.374359] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3163330 ] 00:05:18.706 EAL: No free 2048 kB hugepages reported on node 1 00:05:18.706 [2024-11-29 09:28:41.442621] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:18.706 [2024-11-29 09:28:41.442647] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:18.706 [2024-11-29 09:28:41.517012] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:18.706 [2024-11-29 09:28:41.517151] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:19.642 09:28:42 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:19.642 09:28:42 -- common/autotest_common.sh@862 -- # return 0 00:05:19.642 09:28:42 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=3163594 00:05:19.642 09:28:42 -- event/cpu_locks.sh@103 -- # waitforlisten 3163594 /var/tmp/spdk2.sock 00:05:19.642 09:28:42 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:19.642 09:28:42 -- common/autotest_common.sh@829 -- # '[' -z 3163594 ']' 00:05:19.642 09:28:42 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:19.642 09:28:42 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:19.642 09:28:42 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 
00:05:19.642 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:19.642 09:28:42 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:19.642 09:28:42 -- common/autotest_common.sh@10 -- # set +x 00:05:19.642 [2024-11-29 09:28:42.230904] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:19.642 [2024-11-29 09:28:42.230990] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3163594 ] 00:05:19.642 EAL: No free 2048 kB hugepages reported on node 1 00:05:19.642 [2024-11-29 09:28:42.319449] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:19.642 [2024-11-29 09:28:42.456156] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:19.642 [2024-11-29 09:28:42.456283] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:20.579 09:28:43 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:20.579 09:28:43 -- common/autotest_common.sh@862 -- # return 0 00:05:20.579 09:28:43 -- event/cpu_locks.sh@105 -- # locks_exist 3163594 00:05:20.579 09:28:43 -- event/cpu_locks.sh@22 -- # lslocks -p 3163594 00:05:20.579 09:28:43 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:20.838 lslocks: write error 00:05:20.838 09:28:43 -- event/cpu_locks.sh@107 -- # killprocess 3163330 00:05:20.838 09:28:43 -- common/autotest_common.sh@936 -- # '[' -z 3163330 ']' 00:05:20.838 09:28:43 -- common/autotest_common.sh@940 -- # kill -0 3163330 00:05:20.838 09:28:43 -- common/autotest_common.sh@941 -- # uname 00:05:20.838 09:28:43 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:20.838 09:28:43 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3163330 00:05:20.838 09:28:43 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:20.838 09:28:43 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:20.838 09:28:43 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3163330' 00:05:20.838 killing process with pid 3163330 00:05:20.838 09:28:43 -- common/autotest_common.sh@955 -- # kill 3163330 00:05:20.838 09:28:43 -- common/autotest_common.sh@960 -- # wait 3163330 00:05:21.776 09:28:44 -- event/cpu_locks.sh@108 -- # killprocess 3163594 00:05:21.776 09:28:44 -- common/autotest_common.sh@936 -- # '[' -z 3163594 ']' 00:05:21.776 09:28:44 -- common/autotest_common.sh@940 -- # kill -0 3163594 00:05:21.776 09:28:44 -- common/autotest_common.sh@941 -- # uname 00:05:21.776 09:28:44 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:21.776 09:28:44 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3163594 00:05:21.776 09:28:44 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:21.776 09:28:44 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:21.776 09:28:44 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3163594' 00:05:21.776 killing process with pid 3163594 00:05:21.776 09:28:44 -- common/autotest_common.sh@955 -- # kill 3163594 00:05:21.776 09:28:44 -- common/autotest_common.sh@960 -- # wait 3163594 00:05:22.035 00:05:22.035 real 0m3.270s 00:05:22.035 user 0m3.504s 00:05:22.035 sys 0m1.022s 00:05:22.035 09:28:44 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:22.035 09:28:44 -- common/autotest_common.sh@10 -- # set +x 00:05:22.035 
************************************ 00:05:22.035 END TEST locking_app_on_unlocked_coremask 00:05:22.035 ************************************ 00:05:22.035 09:28:44 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:05:22.035 09:28:44 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:22.035 09:28:44 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:22.035 09:28:44 -- common/autotest_common.sh@10 -- # set +x 00:05:22.035 ************************************ 00:05:22.035 START TEST locking_app_on_locked_coremask 00:05:22.035 ************************************ 00:05:22.035 09:28:44 -- common/autotest_common.sh@1114 -- # locking_app_on_locked_coremask 00:05:22.035 09:28:44 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=3164051 00:05:22.035 09:28:44 -- event/cpu_locks.sh@116 -- # waitforlisten 3164051 /var/tmp/spdk.sock 00:05:22.035 09:28:44 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:22.035 09:28:44 -- common/autotest_common.sh@829 -- # '[' -z 3164051 ']' 00:05:22.035 09:28:44 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:22.035 09:28:44 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:22.035 09:28:44 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:22.035 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:22.035 09:28:44 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:22.035 09:28:44 -- common/autotest_common.sh@10 -- # set +x 00:05:22.035 [2024-11-29 09:28:44.695354] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:22.035 [2024-11-29 09:28:44.695423] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3164051 ] 00:05:22.035 EAL: No free 2048 kB hugepages reported on node 1 00:05:22.035 [2024-11-29 09:28:44.763504] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:22.035 [2024-11-29 09:28:44.833581] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:22.035 [2024-11-29 09:28:44.833707] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:22.970 09:28:45 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:22.970 09:28:45 -- common/autotest_common.sh@862 -- # return 0 00:05:22.970 09:28:45 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:22.970 09:28:45 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=3164172 00:05:22.970 09:28:45 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 3164172 /var/tmp/spdk2.sock 00:05:22.970 09:28:45 -- common/autotest_common.sh@650 -- # local es=0 00:05:22.970 09:28:45 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 3164172 /var/tmp/spdk2.sock 00:05:22.970 09:28:45 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:05:22.970 09:28:45 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:22.970 09:28:45 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:05:22.970 09:28:45 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:22.970 09:28:45 -- common/autotest_common.sh@653 -- # waitforlisten 3164172 /var/tmp/spdk2.sock 00:05:22.970 09:28:45 -- common/autotest_common.sh@829 -- # '[' -z 3164172 ']' 00:05:22.970 09:28:45 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:22.970 09:28:45 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:22.970 09:28:45 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:22.970 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:22.970 09:28:45 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:22.970 09:28:45 -- common/autotest_common.sh@10 -- # set +x 00:05:22.970 [2024-11-29 09:28:45.548214] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:22.970 [2024-11-29 09:28:45.548298] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3164172 ] 00:05:22.970 EAL: No free 2048 kB hugepages reported on node 1 00:05:22.970 [2024-11-29 09:28:45.637490] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 3164051 has claimed it. 00:05:22.970 [2024-11-29 09:28:45.637530] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 
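This failure is the point of the test: pid 3164051 already holds the lock on core 0, so the second target (same -m 0x1, its own RPC socket) must abort with 'Unable to acquire lock on assigned core mask'. The NOT wrapper traced above turns that expected failure into a test pass by inverting the exit status. A sketch of the pattern, with the signal-exit normalization simplified:

    # NOT-style wrapper: succeeds exactly when the wrapped command fails.
    NOT_sketch() {
        local es=0
        "$@" || es=$?
        (( es > 128 )) && es=$(( es & ~128 ))  # strip the fatal-signal bit (assumption)
        (( !es == 0 ))                          # nonzero es -> arithmetic true -> exit 0
    }

    # Usage, as in the trace: the second target must fail to come up because
    # core 0 is already claimed by the first instance.
    #   build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock & spdk_tgt_pid2=$!
    #   NOT_sketch waitforlisten "$spdk_tgt_pid2" /var/tmp/spdk2.sock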
00:05:23.536 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (3164172) - No such process 00:05:23.536 ERROR: process (pid: 3164172) is no longer running 00:05:23.536 09:28:46 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:23.536 09:28:46 -- common/autotest_common.sh@862 -- # return 1 00:05:23.536 09:28:46 -- common/autotest_common.sh@653 -- # es=1 00:05:23.536 09:28:46 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:23.536 09:28:46 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:23.537 09:28:46 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:23.537 09:28:46 -- event/cpu_locks.sh@122 -- # locks_exist 3164051 00:05:23.537 09:28:46 -- event/cpu_locks.sh@22 -- # lslocks -p 3164051 00:05:23.537 09:28:46 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:24.103 lslocks: write error 00:05:24.103 09:28:46 -- event/cpu_locks.sh@124 -- # killprocess 3164051 00:05:24.103 09:28:46 -- common/autotest_common.sh@936 -- # '[' -z 3164051 ']' 00:05:24.103 09:28:46 -- common/autotest_common.sh@940 -- # kill -0 3164051 00:05:24.103 09:28:46 -- common/autotest_common.sh@941 -- # uname 00:05:24.103 09:28:46 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:24.103 09:28:46 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3164051 00:05:24.103 09:28:46 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:24.103 09:28:46 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:24.103 09:28:46 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3164051' 00:05:24.103 killing process with pid 3164051 00:05:24.103 09:28:46 -- common/autotest_common.sh@955 -- # kill 3164051 00:05:24.103 09:28:46 -- common/autotest_common.sh@960 -- # wait 3164051 00:05:24.362 00:05:24.362 real 0m2.357s 00:05:24.362 user 0m2.602s 00:05:24.362 sys 0m0.688s 00:05:24.362 09:28:47 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:24.362 09:28:47 -- common/autotest_common.sh@10 -- # set +x 00:05:24.363 ************************************ 00:05:24.363 END TEST locking_app_on_locked_coremask 00:05:24.363 ************************************ 00:05:24.363 09:28:47 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:05:24.363 09:28:47 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:24.363 09:28:47 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:24.363 09:28:47 -- common/autotest_common.sh@10 -- # set +x 00:05:24.363 ************************************ 00:05:24.363 START TEST locking_overlapped_coremask 00:05:24.363 ************************************ 00:05:24.363 09:28:47 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask 00:05:24.363 09:28:47 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=3164476 00:05:24.363 09:28:47 -- event/cpu_locks.sh@133 -- # waitforlisten 3164476 /var/tmp/spdk.sock 00:05:24.363 09:28:47 -- common/autotest_common.sh@829 -- # '[' -z 3164476 ']' 00:05:24.363 09:28:47 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:24.363 09:28:47 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:24.363 09:28:47 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:24.363 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
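The locks_exist check traced above boils down to a single pipeline, and the stray "lslocks: write error" seen further on is expected: grep -q exits as soon as it matches, so lslocks gets EPIPE on its remaining output. Reconstructed from the xtrace as a sketch, not necessarily the verbatim cpu_locks.sh source:
locks_exist() {
  lslocks -p "$1" | grep -q spdk_cpu_lock      # does this pid hold core locks?
}
locks_exist 3164051 && echo 'pid 3164051 still holds its core lock'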
00:05:24.363 09:28:47 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:24.363 09:28:47 -- common/autotest_common.sh@10 -- # set +x 00:05:24.363 09:28:47 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:05:24.363 [2024-11-29 09:28:47.094430] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:24.363 [2024-11-29 09:28:47.094501] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3164476 ] 00:05:24.363 EAL: No free 2048 kB hugepages reported on node 1 00:05:24.363 [2024-11-29 09:28:47.160481] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:24.621 [2024-11-29 09:28:47.230785] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:24.621 [2024-11-29 09:28:47.230934] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:24.622 [2024-11-29 09:28:47.231045] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:24.622 [2024-11-29 09:28:47.231048] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:25.189 09:28:47 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:25.189 09:28:47 -- common/autotest_common.sh@862 -- # return 0 00:05:25.189 09:28:47 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=3164741 00:05:25.189 09:28:47 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 3164741 /var/tmp/spdk2.sock 00:05:25.189 09:28:47 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:05:25.189 09:28:47 -- common/autotest_common.sh@650 -- # local es=0 00:05:25.189 09:28:47 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 3164741 /var/tmp/spdk2.sock 00:05:25.189 09:28:47 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:05:25.189 09:28:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:25.189 09:28:47 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:05:25.189 09:28:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:25.189 09:28:47 -- common/autotest_common.sh@653 -- # waitforlisten 3164741 /var/tmp/spdk2.sock 00:05:25.189 09:28:47 -- common/autotest_common.sh@829 -- # '[' -z 3164741 ']' 00:05:25.189 09:28:47 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:25.189 09:28:47 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:25.189 09:28:47 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:25.189 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:25.189 09:28:47 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:25.189 09:28:47 -- common/autotest_common.sh@10 -- # set +x 00:05:25.189 [2024-11-29 09:28:47.945910] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:25.189 [2024-11-29 09:28:47.945984] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3164741 ] 00:05:25.189 EAL: No free 2048 kB hugepages reported on node 1 00:05:25.448 [2024-11-29 09:28:48.040420] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 3164476 has claimed it. 00:05:25.448 [2024-11-29 09:28:48.040454] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:05:26.017 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (3164741) - No such process 00:05:26.017 ERROR: process (pid: 3164741) is no longer running 00:05:26.017 09:28:48 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:26.017 09:28:48 -- common/autotest_common.sh@862 -- # return 1 00:05:26.017 09:28:48 -- common/autotest_common.sh@653 -- # es=1 00:05:26.017 09:28:48 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:26.017 09:28:48 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:26.017 09:28:48 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:26.017 09:28:48 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:05:26.017 09:28:48 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:05:26.017 09:28:48 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:05:26.017 09:28:48 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:05:26.017 09:28:48 -- event/cpu_locks.sh@141 -- # killprocess 3164476 00:05:26.017 09:28:48 -- common/autotest_common.sh@936 -- # '[' -z 3164476 ']' 00:05:26.017 09:28:48 -- common/autotest_common.sh@940 -- # kill -0 3164476 00:05:26.017 09:28:48 -- common/autotest_common.sh@941 -- # uname 00:05:26.017 09:28:48 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:26.017 09:28:48 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3164476 00:05:26.017 09:28:48 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:26.017 09:28:48 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:26.017 09:28:48 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3164476' 00:05:26.017 killing process with pid 3164476 00:05:26.017 09:28:48 -- common/autotest_common.sh@955 -- # kill 3164476 00:05:26.017 09:28:48 -- common/autotest_common.sh@960 -- # wait 3164476 00:05:26.276 00:05:26.276 real 0m1.902s 00:05:26.276 user 0m5.405s 00:05:26.276 sys 0m0.452s 00:05:26.276 09:28:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:26.276 09:28:48 -- common/autotest_common.sh@10 -- # set +x 00:05:26.276 ************************************ 00:05:26.276 END TEST locking_overlapped_coremask 00:05:26.276 ************************************ 00:05:26.276 09:28:49 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:05:26.276 09:28:49 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:26.276 09:28:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:26.276 09:28:49 -- common/autotest_common.sh@10 -- # set +x 00:05:26.276 ************************************ 00:05:26.276 
START TEST locking_overlapped_coremask_via_rpc 00:05:26.276 ************************************ 00:05:26.276 09:28:49 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask_via_rpc 00:05:26.276 09:28:49 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=3164831 00:05:26.276 09:28:49 -- event/cpu_locks.sh@149 -- # waitforlisten 3164831 /var/tmp/spdk.sock 00:05:26.276 09:28:49 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:05:26.276 09:28:49 -- common/autotest_common.sh@829 -- # '[' -z 3164831 ']' 00:05:26.277 09:28:49 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:26.277 09:28:49 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:26.277 09:28:49 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:26.277 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:26.277 09:28:49 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:26.277 09:28:49 -- common/autotest_common.sh@10 -- # set +x 00:05:26.277 [2024-11-29 09:28:49.052082] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:26.277 [2024-11-29 09:28:49.052178] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3164831 ] 00:05:26.277 EAL: No free 2048 kB hugepages reported on node 1 00:05:26.535 [2024-11-29 09:28:49.123411] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:26.535 [2024-11-29 09:28:49.123438] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:26.535 [2024-11-29 09:28:49.199991] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:26.535 [2024-11-29 09:28:49.200135] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:26.535 [2024-11-29 09:28:49.200228] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:26.535 [2024-11-29 09:28:49.200230] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:27.103 09:28:49 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:27.103 09:28:49 -- common/autotest_common.sh@862 -- # return 0 00:05:27.103 09:28:49 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=3165056 00:05:27.103 09:28:49 -- event/cpu_locks.sh@153 -- # waitforlisten 3165056 /var/tmp/spdk2.sock 00:05:27.103 09:28:49 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:05:27.103 09:28:49 -- common/autotest_common.sh@829 -- # '[' -z 3165056 ']' 00:05:27.103 09:28:49 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:27.103 09:28:49 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:27.103 09:28:49 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:27.103 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
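Reconstructed from the trace around this point, the shape of the via_rpc variant: both targets boot with --disable-cpumask-locks, then locking is switched on over JSON-RPC, so the collision on the shared core happens at RPC time instead of startup (masks 0x7 and 0x1c overlap only on core 2, which is exactly the core named in the error below). A condensed sketch that elides the waitforlisten plumbing:
spdk_tgt -m 0x7  --disable-cpumask-locks &                         # cores 0-2
spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks &  # cores 2-4
rpc_cmd framework_enable_cpumask_locks                             # claims 0-2
NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks  # core 2 taken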
00:05:27.103 09:28:49 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:27.103 09:28:49 -- common/autotest_common.sh@10 -- # set +x 00:05:27.103 [2024-11-29 09:28:49.918837] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:27.103 [2024-11-29 09:28:49.918912] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3165056 ] 00:05:27.361 EAL: No free 2048 kB hugepages reported on node 1 00:05:27.361 [2024-11-29 09:28:50.014600] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:05:27.361 [2024-11-29 09:28:50.014632] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:27.361 [2024-11-29 09:28:50.182677] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:27.361 [2024-11-29 09:28:50.182824] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:05:27.361 [2024-11-29 09:28:50.182938] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:05:27.361 [2024-11-29 09:28:50.182939] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:05:27.929 09:28:50 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:27.929 09:28:50 -- common/autotest_common.sh@862 -- # return 0 00:05:27.929 09:28:50 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:05:27.929 09:28:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:27.929 09:28:50 -- common/autotest_common.sh@10 -- # set +x 00:05:28.188 09:28:50 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:28.188 09:28:50 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:28.188 09:28:50 -- common/autotest_common.sh@650 -- # local es=0 00:05:28.188 09:28:50 -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:28.188 09:28:50 -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:05:28.188 09:28:50 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:28.188 09:28:50 -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:05:28.188 09:28:50 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:28.188 09:28:50 -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:05:28.188 09:28:50 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:28.188 09:28:50 -- common/autotest_common.sh@10 -- # set +x 00:05:28.188 [2024-11-29 09:28:50.783662] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 3164831 has claimed it. 
00:05:28.188 request: 00:05:28.188 { 00:05:28.188 "method": "framework_enable_cpumask_locks", 00:05:28.188 "req_id": 1 00:05:28.188 } 00:05:28.188 Got JSON-RPC error response 00:05:28.188 response: 00:05:28.188 { 00:05:28.188 "code": -32603, 00:05:28.188 "message": "Failed to claim CPU core: 2" 00:05:28.188 } 00:05:28.188 09:28:50 -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:05:28.188 09:28:50 -- common/autotest_common.sh@653 -- # es=1 00:05:28.188 09:28:50 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:28.188 09:28:50 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:28.188 09:28:50 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:28.188 09:28:50 -- event/cpu_locks.sh@158 -- # waitforlisten 3164831 /var/tmp/spdk.sock 00:05:28.188 09:28:50 -- common/autotest_common.sh@829 -- # '[' -z 3164831 ']' 00:05:28.188 09:28:50 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:28.188 09:28:50 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:28.188 09:28:50 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:28.188 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:28.188 09:28:50 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:28.188 09:28:50 -- common/autotest_common.sh@10 -- # set +x 00:05:28.188 09:28:50 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:28.188 09:28:50 -- common/autotest_common.sh@862 -- # return 0 00:05:28.188 09:28:50 -- event/cpu_locks.sh@159 -- # waitforlisten 3165056 /var/tmp/spdk2.sock 00:05:28.188 09:28:50 -- common/autotest_common.sh@829 -- # '[' -z 3165056 ']' 00:05:28.188 09:28:50 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:28.188 09:28:50 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:28.188 09:28:50 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:28.188 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
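The request/response pair above is plain JSON-RPC over a unix socket, so it can in principle be replayed by hand. A sketch assuming a netcat build with unix-socket support (-U); the test itself goes through rpc_cmd/rpc.py, which is the supported client:
printf '%s\n' '{"jsonrpc":"2.0","id":1,"method":"framework_enable_cpumask_locks"}' \
  | nc -U /var/tmp/spdk2.sock
# expected error payload, matching the trace:
#   {"code":-32603,"message":"Failed to claim CPU core: 2"}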
00:05:28.188 09:28:50 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:28.188 09:28:50 -- common/autotest_common.sh@10 -- # set +x 00:05:28.447 09:28:51 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:28.447 09:28:51 -- common/autotest_common.sh@862 -- # return 0 00:05:28.447 09:28:51 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:05:28.447 09:28:51 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:05:28.447 09:28:51 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:05:28.447 09:28:51 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:05:28.447 00:05:28.447 real 0m2.160s 00:05:28.447 user 0m0.895s 00:05:28.447 sys 0m0.197s 00:05:28.447 09:28:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:28.447 09:28:51 -- common/autotest_common.sh@10 -- # set +x 00:05:28.447 ************************************ 00:05:28.447 END TEST locking_overlapped_coremask_via_rpc 00:05:28.447 ************************************ 00:05:28.447 09:28:51 -- event/cpu_locks.sh@174 -- # cleanup 00:05:28.447 09:28:51 -- event/cpu_locks.sh@15 -- # [[ -z 3164831 ]] 00:05:28.447 09:28:51 -- event/cpu_locks.sh@15 -- # killprocess 3164831 00:05:28.447 09:28:51 -- common/autotest_common.sh@936 -- # '[' -z 3164831 ']' 00:05:28.447 09:28:51 -- common/autotest_common.sh@940 -- # kill -0 3164831 00:05:28.447 09:28:51 -- common/autotest_common.sh@941 -- # uname 00:05:28.447 09:28:51 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:28.447 09:28:51 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3164831 00:05:28.447 09:28:51 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:28.447 09:28:51 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:28.447 09:28:51 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3164831' 00:05:28.447 killing process with pid 3164831 00:05:28.447 09:28:51 -- common/autotest_common.sh@955 -- # kill 3164831 00:05:28.447 09:28:51 -- common/autotest_common.sh@960 -- # wait 3164831 00:05:29.019 09:28:51 -- event/cpu_locks.sh@16 -- # [[ -z 3165056 ]] 00:05:29.019 09:28:51 -- event/cpu_locks.sh@16 -- # killprocess 3165056 00:05:29.019 09:28:51 -- common/autotest_common.sh@936 -- # '[' -z 3165056 ']' 00:05:29.019 09:28:51 -- common/autotest_common.sh@940 -- # kill -0 3165056 00:05:29.019 09:28:51 -- common/autotest_common.sh@941 -- # uname 00:05:29.019 09:28:51 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:29.019 09:28:51 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3165056 00:05:29.019 09:28:51 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:05:29.019 09:28:51 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:05:29.019 09:28:51 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3165056' 00:05:29.019 killing process with pid 3165056 00:05:29.019 09:28:51 -- common/autotest_common.sh@955 -- # kill 3165056 00:05:29.019 09:28:51 -- common/autotest_common.sh@960 -- # wait 3165056 00:05:29.278 09:28:51 -- event/cpu_locks.sh@18 -- # rm -f 00:05:29.278 09:28:51 -- event/cpu_locks.sh@1 -- # cleanup 00:05:29.278 09:28:51 -- event/cpu_locks.sh@15 -- # [[ -z 3164831 ]] 00:05:29.278 09:28:51 -- event/cpu_locks.sh@15 -- # killprocess 3164831 
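The check_remaining_locks steps traced above, gathered into one readable helper (reconstructed from the xtrace, so treat it as a sketch of cpu_locks.sh rather than the verbatim source): the lock files left in /var/tmp must be exactly the ones for cores 0-2, i.e. what a 0x7 mask should leave behind.
check_remaining_locks() {
  locks=(/var/tmp/spdk_cpu_lock_*)                    # lock files present
  locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})  # cores 0-2 expected
  [[ ${locks[*]} == "${locks_expected[*]}" ]]         # fail on any mismatch
}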
00:05:29.278 09:28:51 -- common/autotest_common.sh@936 -- # '[' -z 3164831 ']' 00:05:29.278 09:28:51 -- common/autotest_common.sh@940 -- # kill -0 3164831 00:05:29.278 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (3164831) - No such process 00:05:29.278 09:28:51 -- common/autotest_common.sh@963 -- # echo 'Process with pid 3164831 is not found' 00:05:29.278 Process with pid 3164831 is not found 00:05:29.278 09:28:51 -- event/cpu_locks.sh@16 -- # [[ -z 3165056 ]] 00:05:29.278 09:28:51 -- event/cpu_locks.sh@16 -- # killprocess 3165056 00:05:29.278 09:28:51 -- common/autotest_common.sh@936 -- # '[' -z 3165056 ']' 00:05:29.278 09:28:51 -- common/autotest_common.sh@940 -- # kill -0 3165056 00:05:29.278 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (3165056) - No such process 00:05:29.278 09:28:51 -- common/autotest_common.sh@963 -- # echo 'Process with pid 3165056 is not found' 00:05:29.278 Process with pid 3165056 is not found 00:05:29.278 09:28:51 -- event/cpu_locks.sh@18 -- # rm -f 00:05:29.278 00:05:29.278 real 0m17.632s 00:05:29.278 user 0m30.387s 00:05:29.278 sys 0m5.531s 00:05:29.278 09:28:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:29.278 09:28:51 -- common/autotest_common.sh@10 -- # set +x 00:05:29.278 ************************************ 00:05:29.278 END TEST cpu_locks 00:05:29.278 ************************************ 00:05:29.278 00:05:29.278 real 0m43.037s 00:05:29.278 user 1m21.465s 00:05:29.278 sys 0m9.587s 00:05:29.278 09:28:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:29.278 09:28:52 -- common/autotest_common.sh@10 -- # set +x 00:05:29.278 ************************************ 00:05:29.278 END TEST event 00:05:29.278 ************************************ 00:05:29.278 09:28:52 -- spdk/autotest.sh@175 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:05:29.278 09:28:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:29.278 09:28:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:29.278 09:28:52 -- common/autotest_common.sh@10 -- # set +x 00:05:29.278 ************************************ 00:05:29.278 START TEST thread 00:05:29.278 ************************************ 00:05:29.278 09:28:52 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:05:29.537 * Looking for test storage... 
00:05:29.537 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:05:29.537 09:28:52 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:29.537 09:28:52 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:29.537 09:28:52 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:29.537 09:28:52 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:29.537 09:28:52 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:29.537 09:28:52 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:29.537 09:28:52 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:29.537 09:28:52 -- scripts/common.sh@335 -- # IFS=.-: 00:05:29.537 09:28:52 -- scripts/common.sh@335 -- # read -ra ver1 00:05:29.537 09:28:52 -- scripts/common.sh@336 -- # IFS=.-: 00:05:29.537 09:28:52 -- scripts/common.sh@336 -- # read -ra ver2 00:05:29.537 09:28:52 -- scripts/common.sh@337 -- # local 'op=<' 00:05:29.537 09:28:52 -- scripts/common.sh@339 -- # ver1_l=2 00:05:29.537 09:28:52 -- scripts/common.sh@340 -- # ver2_l=1 00:05:29.537 09:28:52 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:29.537 09:28:52 -- scripts/common.sh@343 -- # case "$op" in 00:05:29.537 09:28:52 -- scripts/common.sh@344 -- # : 1 00:05:29.537 09:28:52 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:29.537 09:28:52 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:29.537 09:28:52 -- scripts/common.sh@364 -- # decimal 1 00:05:29.537 09:28:52 -- scripts/common.sh@352 -- # local d=1 00:05:29.537 09:28:52 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:29.537 09:28:52 -- scripts/common.sh@354 -- # echo 1 00:05:29.537 09:28:52 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:29.537 09:28:52 -- scripts/common.sh@365 -- # decimal 2 00:05:29.537 09:28:52 -- scripts/common.sh@352 -- # local d=2 00:05:29.537 09:28:52 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:29.537 09:28:52 -- scripts/common.sh@354 -- # echo 2 00:05:29.537 09:28:52 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:29.537 09:28:52 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:29.537 09:28:52 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:29.537 09:28:52 -- scripts/common.sh@367 -- # return 0 00:05:29.537 09:28:52 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:29.537 09:28:52 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:29.537 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:29.537 --rc genhtml_branch_coverage=1 00:05:29.538 --rc genhtml_function_coverage=1 00:05:29.538 --rc genhtml_legend=1 00:05:29.538 --rc geninfo_all_blocks=1 00:05:29.538 --rc geninfo_unexecuted_blocks=1 00:05:29.538 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:29.538 ' 00:05:29.538 09:28:52 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:29.538 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:29.538 --rc genhtml_branch_coverage=1 00:05:29.538 --rc genhtml_function_coverage=1 00:05:29.538 --rc genhtml_legend=1 00:05:29.538 --rc geninfo_all_blocks=1 00:05:29.538 --rc geninfo_unexecuted_blocks=1 00:05:29.538 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:29.538 ' 00:05:29.538 09:28:52 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:29.538 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:29.538 --rc genhtml_branch_coverage=1 
00:05:29.538 --rc genhtml_function_coverage=1 00:05:29.538 --rc genhtml_legend=1 00:05:29.538 --rc geninfo_all_blocks=1 00:05:29.538 --rc geninfo_unexecuted_blocks=1 00:05:29.538 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:29.538 ' 00:05:29.538 09:28:52 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:29.538 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:29.538 --rc genhtml_branch_coverage=1 00:05:29.538 --rc genhtml_function_coverage=1 00:05:29.538 --rc genhtml_legend=1 00:05:29.538 --rc geninfo_all_blocks=1 00:05:29.538 --rc geninfo_unexecuted_blocks=1 00:05:29.538 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:29.538 ' 00:05:29.538 09:28:52 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:05:29.538 09:28:52 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:05:29.538 09:28:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:29.538 09:28:52 -- common/autotest_common.sh@10 -- # set +x 00:05:29.538 ************************************ 00:05:29.538 START TEST thread_poller_perf 00:05:29.538 ************************************ 00:05:29.538 09:28:52 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:05:29.538 [2024-11-29 09:28:52.266852] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:29.538 [2024-11-29 09:28:52.266917] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3165593 ] 00:05:29.538 EAL: No free 2048 kB hugepages reported on node 1 00:05:29.538 [2024-11-29 09:28:52.331062] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:29.796 [2024-11-29 09:28:52.404497] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:29.796 Running 1000 pollers for 1 seconds with 1 microseconds period. 
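The scripts/common.sh trace earlier in this block (lt 1.15 2 driving cmp_versions) decides whether the installed lcov predates version 2 and picks the matching LCOV_OPTS set. The same check can be sketched with GNU sort -V in place of the field-by-field loop; a simplification, not the common.sh implementation:
lt() {  # succeed when version $1 sorts strictly below $2
  [ "$1" = "$2" ] && return 1
  [ "$(printf '%s\n' "$1" "$2" | sort -V | head -n1)" = "$1" ]
}
lt 1.15 2 && echo 'lcov 1.15 < 2, keep the 1.x option set'   # the branch taken above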
00:05:30.732 [2024-11-29T08:28:53.574Z] ====================================== 00:05:30.732 [2024-11-29T08:28:53.574Z] busy:2504891238 (cyc) 00:05:30.732 [2024-11-29T08:28:53.574Z] total_run_count: 804000 00:05:30.732 [2024-11-29T08:28:53.574Z] tsc_hz: 2500000000 (cyc) 00:05:30.732 [2024-11-29T08:28:53.574Z] ====================================== 00:05:30.732 [2024-11-29T08:28:53.574Z] poller_cost: 3115 (cyc), 1246 (nsec) 00:05:30.732 00:05:30.732 real 0m1.211s 00:05:30.733 user 0m1.124s 00:05:30.733 sys 0m0.083s 00:05:30.733 09:28:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:30.733 09:28:53 -- common/autotest_common.sh@10 -- # set +x 00:05:30.733 ************************************ 00:05:30.733 END TEST thread_poller_perf 00:05:30.733 ************************************ 00:05:30.733 09:28:53 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:05:30.733 09:28:53 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:05:30.733 09:28:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:30.733 09:28:53 -- common/autotest_common.sh@10 -- # set +x 00:05:30.733 ************************************ 00:05:30.733 START TEST thread_poller_perf 00:05:30.733 ************************************ 00:05:30.733 09:28:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:05:30.733 [2024-11-29 09:28:53.535024] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:30.733 [2024-11-29 09:28:53.535110] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3165748 ] 00:05:30.733 EAL: No free 2048 kB hugepages reported on node 1 00:05:30.992 [2024-11-29 09:28:53.604265] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:30.992 [2024-11-29 09:28:53.674198] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:30.992 Running 1000 pollers for 1 seconds with 0 microseconds period. 
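Sanity-checking the first run's figures above with the same arithmetic poller_perf uses: cost per poll is busy cycles over run count, and nanoseconds follow from the reported 2.5 GHz TSC. All three inputs come straight from the summary block:
busy=2504891238 runs=804000 tsc_hz=2500000000
echo $(( busy / runs ))                          # 3115 cycles per poll
echo $(( busy / runs * 1000000000 / tsc_hz ))    # 1246 ns, as reported
The 0-microsecond run that follows checks out the same way: 2502179994 / 13537000 gives 184 cycles, or 73 ns.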
00:05:31.931 [2024-11-29T08:28:54.773Z] ====================================== 00:05:31.931 [2024-11-29T08:28:54.773Z] busy:2502179994 (cyc) 00:05:31.931 [2024-11-29T08:28:54.773Z] total_run_count: 13537000 00:05:31.931 [2024-11-29T08:28:54.773Z] tsc_hz: 2500000000 (cyc) 00:05:31.931 [2024-11-29T08:28:54.773Z] ====================================== 00:05:31.931 [2024-11-29T08:28:54.773Z] poller_cost: 184 (cyc), 73 (nsec) 00:05:31.931 00:05:31.931 real 0m1.219s 00:05:31.931 user 0m1.125s 00:05:31.931 sys 0m0.089s 00:05:31.931 09:28:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:31.931 09:28:54 -- common/autotest_common.sh@10 -- # set +x 00:05:31.931 ************************************ 00:05:31.931 END TEST thread_poller_perf 00:05:31.931 ************************************ 00:05:32.191 09:28:54 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:05:32.191 09:28:54 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:05:32.191 09:28:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:32.191 09:28:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:32.191 09:28:54 -- common/autotest_common.sh@10 -- # set +x 00:05:32.191 ************************************ 00:05:32.191 START TEST thread_spdk_lock 00:05:32.191 ************************************ 00:05:32.191 09:28:54 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:05:32.191 [2024-11-29 09:28:54.802784] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:32.191 [2024-11-29 09:28:54.802878] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3166007 ] 00:05:32.191 EAL: No free 2048 kB hugepages reported on node 1 00:05:32.191 [2024-11-29 09:28:54.874242] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:32.191 [2024-11-29 09:28:54.943697] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:32.191 [2024-11-29 09:28:54.943699] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:32.759 [2024-11-29 09:28:55.432682] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 957:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:32.759 [2024-11-29 09:28:55.432719] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3064:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:05:32.759 [2024-11-29 09:28:55.432729] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3019:sspin_stacks_print: *ERROR*: spinlock 0x1483c80 00:05:32.759 [2024-11-29 09:28:55.433644] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:32.759 [2024-11-29 09:28:55.433747] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1018:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:32.759 [2024-11-29 09:28:55.433766] 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:05:32.759 Starting test contend 00:05:32.759 Worker Delay Wait us Hold us Total us 00:05:32.759 0 3 170401 186580 356981 00:05:32.759 1 5 92550 285599 378149 00:05:32.759 PASS test contend 00:05:32.759 Starting test hold_by_poller 00:05:32.759 PASS test hold_by_poller 00:05:32.759 Starting test hold_by_message 00:05:32.759 PASS test hold_by_message 00:05:32.759 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:05:32.759 100014 assertions passed 00:05:32.759 0 assertions failed 00:05:32.759 00:05:32.759 real 0m0.708s 00:05:32.759 user 0m1.110s 00:05:32.759 sys 0m0.083s 00:05:32.759 09:28:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:32.759 09:28:55 -- common/autotest_common.sh@10 -- # set +x 00:05:32.759 ************************************ 00:05:32.759 END TEST thread_spdk_lock 00:05:32.759 ************************************ 00:05:32.759 00:05:32.759 real 0m3.472s 00:05:32.759 user 0m3.522s 00:05:32.759 sys 0m0.470s 00:05:32.759 09:28:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:32.759 09:28:55 -- common/autotest_common.sh@10 -- # set +x 00:05:32.759 ************************************ 00:05:32.759 END TEST thread 00:05:32.759 ************************************ 00:05:32.759 09:28:55 -- spdk/autotest.sh@176 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:05:32.759 09:28:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:32.759 09:28:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:32.759 09:28:55 -- common/autotest_common.sh@10 -- # set +x 00:05:32.759 ************************************ 00:05:32.759 START TEST accel 00:05:32.759 ************************************ 00:05:32.759 09:28:55 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:05:33.019 * Looking for test storage... 00:05:33.019 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:05:33.019 09:28:55 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:33.019 09:28:55 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:33.019 09:28:55 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:33.019 09:28:55 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:33.019 09:28:55 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:33.019 09:28:55 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:33.019 09:28:55 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:33.019 09:28:55 -- scripts/common.sh@335 -- # IFS=.-: 00:05:33.019 09:28:55 -- scripts/common.sh@335 -- # read -ra ver1 00:05:33.019 09:28:55 -- scripts/common.sh@336 -- # IFS=.-: 00:05:33.019 09:28:55 -- scripts/common.sh@336 -- # read -ra ver2 00:05:33.019 09:28:55 -- scripts/common.sh@337 -- # local 'op=<' 00:05:33.019 09:28:55 -- scripts/common.sh@339 -- # ver1_l=2 00:05:33.019 09:28:55 -- scripts/common.sh@340 -- # ver2_l=1 00:05:33.019 09:28:55 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:33.019 09:28:55 -- scripts/common.sh@343 -- # case "$op" in 00:05:33.019 09:28:55 -- scripts/common.sh@344 -- # : 1 00:05:33.019 09:28:55 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:33.019 09:28:55 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:33.019 09:28:55 -- scripts/common.sh@364 -- # decimal 1 00:05:33.019 09:28:55 -- scripts/common.sh@352 -- # local d=1 00:05:33.019 09:28:55 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:33.019 09:28:55 -- scripts/common.sh@354 -- # echo 1 00:05:33.019 09:28:55 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:33.019 09:28:55 -- scripts/common.sh@365 -- # decimal 2 00:05:33.019 09:28:55 -- scripts/common.sh@352 -- # local d=2 00:05:33.019 09:28:55 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:33.019 09:28:55 -- scripts/common.sh@354 -- # echo 2 00:05:33.019 09:28:55 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:33.019 09:28:55 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:33.019 09:28:55 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:33.019 09:28:55 -- scripts/common.sh@367 -- # return 0 00:05:33.019 09:28:55 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:33.019 09:28:55 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:33.019 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:33.019 --rc genhtml_branch_coverage=1 00:05:33.019 --rc genhtml_function_coverage=1 00:05:33.019 --rc genhtml_legend=1 00:05:33.019 --rc geninfo_all_blocks=1 00:05:33.019 --rc geninfo_unexecuted_blocks=1 00:05:33.019 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:33.019 ' 00:05:33.019 09:28:55 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:33.019 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:33.019 --rc genhtml_branch_coverage=1 00:05:33.019 --rc genhtml_function_coverage=1 00:05:33.019 --rc genhtml_legend=1 00:05:33.019 --rc geninfo_all_blocks=1 00:05:33.019 --rc geninfo_unexecuted_blocks=1 00:05:33.019 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:33.019 ' 00:05:33.019 09:28:55 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:33.019 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:33.019 --rc genhtml_branch_coverage=1 00:05:33.019 --rc genhtml_function_coverage=1 00:05:33.019 --rc genhtml_legend=1 00:05:33.019 --rc geninfo_all_blocks=1 00:05:33.019 --rc geninfo_unexecuted_blocks=1 00:05:33.019 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:33.019 ' 00:05:33.019 09:28:55 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:33.019 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:33.019 --rc genhtml_branch_coverage=1 00:05:33.019 --rc genhtml_function_coverage=1 00:05:33.019 --rc genhtml_legend=1 00:05:33.019 --rc geninfo_all_blocks=1 00:05:33.019 --rc geninfo_unexecuted_blocks=1 00:05:33.019 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:33.019 ' 00:05:33.019 09:28:55 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:05:33.019 09:28:55 -- accel/accel.sh@74 -- # get_expected_opcs 00:05:33.019 09:28:55 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:33.019 09:28:55 -- accel/accel.sh@59 -- # spdk_tgt_pid=3166335 00:05:33.019 09:28:55 -- accel/accel.sh@60 -- # waitforlisten 3166335 00:05:33.019 09:28:55 -- common/autotest_common.sh@829 -- # '[' -z 3166335 ']' 00:05:33.019 09:28:55 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:33.019 09:28:55 -- accel/accel.sh@58 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:05:33.019 09:28:55 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:33.019 09:28:55 -- accel/accel.sh@58 -- # build_accel_config 00:05:33.019 09:28:55 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:33.019 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:33.019 09:28:55 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:33.019 09:28:55 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:33.019 09:28:55 -- common/autotest_common.sh@10 -- # set +x 00:05:33.019 09:28:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:33.019 09:28:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:33.019 09:28:55 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:33.019 09:28:55 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:33.019 09:28:55 -- accel/accel.sh@41 -- # local IFS=, 00:05:33.019 09:28:55 -- accel/accel.sh@42 -- # jq -r . 00:05:33.019 [2024-11-29 09:28:55.791953] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:33.019 [2024-11-29 09:28:55.792035] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3166335 ] 00:05:33.019 EAL: No free 2048 kB hugepages reported on node 1 00:05:33.019 [2024-11-29 09:28:55.858958] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:33.279 [2024-11-29 09:28:55.930006] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:33.279 [2024-11-29 09:28:55.930114] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:33.847 09:28:56 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:33.847 09:28:56 -- common/autotest_common.sh@862 -- # return 0 00:05:33.847 09:28:56 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:05:33.847 09:28:56 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:05:33.847 09:28:56 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:33.847 09:28:56 -- common/autotest_common.sh@10 -- # set +x 00:05:33.847 09:28:56 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:05:33.847 09:28:56 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:33.847 09:28:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # IFS== 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # read -r opc module 00:05:33.847 09:28:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:33.847 09:28:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # IFS== 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # read -r opc module 00:05:33.847 09:28:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:33.847 09:28:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # IFS== 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # read -r opc module 00:05:33.847 09:28:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:33.847 09:28:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # IFS== 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # read -r opc module 00:05:33.847 09:28:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:33.847 09:28:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # IFS== 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # read -r opc module 00:05:33.847 09:28:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:33.847 09:28:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # IFS== 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # read -r opc module 00:05:33.847 09:28:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:33.847 09:28:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # IFS== 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # read -r opc module 00:05:33.847 09:28:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:33.847 09:28:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # IFS== 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # read -r opc module 00:05:33.847 09:28:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:33.847 09:28:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # IFS== 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # read -r opc module 00:05:33.847 09:28:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:33.847 09:28:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # IFS== 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # read -r opc module 00:05:33.847 09:28:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:33.847 09:28:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # IFS== 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # read -r opc module 00:05:33.847 09:28:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:33.847 09:28:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # IFS== 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # read -r opc module 00:05:33.847 09:28:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:33.847 09:28:56 -- accel/accel.sh@63 -- # for opc_opt in 
"${exp_opcs[@]}" 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # IFS== 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # read -r opc module 00:05:33.847 09:28:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:33.847 09:28:56 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # IFS== 00:05:33.847 09:28:56 -- accel/accel.sh@64 -- # read -r opc module 00:05:33.847 09:28:56 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:05:33.847 09:28:56 -- accel/accel.sh@67 -- # killprocess 3166335 00:05:33.847 09:28:56 -- common/autotest_common.sh@936 -- # '[' -z 3166335 ']' 00:05:33.847 09:28:56 -- common/autotest_common.sh@940 -- # kill -0 3166335 00:05:33.847 09:28:56 -- common/autotest_common.sh@941 -- # uname 00:05:33.847 09:28:56 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:33.847 09:28:56 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3166335 00:05:34.106 09:28:56 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:34.106 09:28:56 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:34.106 09:28:56 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3166335' 00:05:34.106 killing process with pid 3166335 00:05:34.106 09:28:56 -- common/autotest_common.sh@955 -- # kill 3166335 00:05:34.106 09:28:56 -- common/autotest_common.sh@960 -- # wait 3166335 00:05:34.365 09:28:57 -- accel/accel.sh@68 -- # trap - ERR 00:05:34.365 09:28:57 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:05:34.365 09:28:57 -- common/autotest_common.sh@1087 -- # '[' 3 -le 1 ']' 00:05:34.365 09:28:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:34.365 09:28:57 -- common/autotest_common.sh@10 -- # set +x 00:05:34.365 09:28:57 -- common/autotest_common.sh@1114 -- # accel_perf -h 00:05:34.365 09:28:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:05:34.366 09:28:57 -- accel/accel.sh@12 -- # build_accel_config 00:05:34.366 09:28:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:34.366 09:28:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:34.366 09:28:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:34.366 09:28:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:34.366 09:28:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:34.366 09:28:57 -- accel/accel.sh@41 -- # local IFS=, 00:05:34.366 09:28:57 -- accel/accel.sh@42 -- # jq -r . 
00:05:34.366 09:28:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:34.366 09:28:57 -- common/autotest_common.sh@10 -- # set +x 00:05:34.366 09:28:57 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:05:34.366 09:28:57 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:05:34.366 09:28:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:34.366 09:28:57 -- common/autotest_common.sh@10 -- # set +x 00:05:34.366 ************************************ 00:05:34.366 START TEST accel_missing_filename 00:05:34.366 ************************************ 00:05:34.366 09:28:57 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress 00:05:34.366 09:28:57 -- common/autotest_common.sh@650 -- # local es=0 00:05:34.366 09:28:57 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress 00:05:34.366 09:28:57 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:05:34.366 09:28:57 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:34.366 09:28:57 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:05:34.366 09:28:57 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:34.366 09:28:57 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress 00:05:34.366 09:28:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:05:34.366 09:28:57 -- accel/accel.sh@12 -- # build_accel_config 00:05:34.366 09:28:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:34.366 09:28:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:34.366 09:28:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:34.366 09:28:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:34.366 09:28:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:34.366 09:28:57 -- accel/accel.sh@41 -- # local IFS=, 00:05:34.366 09:28:57 -- accel/accel.sh@42 -- # jq -r . 00:05:34.366 [2024-11-29 09:28:57.116570] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:34.366 [2024-11-29 09:28:57.116696] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3166642 ] 00:05:34.366 EAL: No free 2048 kB hugepages reported on node 1 00:05:34.366 [2024-11-29 09:28:57.187085] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:34.625 [2024-11-29 09:28:57.258027] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:34.625 [2024-11-29 09:28:57.297484] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:34.625 [2024-11-29 09:28:57.357448] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:05:34.625 A filename is required. 
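The abort above is the expected outcome: compress without -l has no input file, and the NOT wrapper turns that failure into a pass. Reduced to its core here; the real autotest_common.sh helper also validates the command via valid_exec_arg and normalizes the exit status, as the es arithmetic just below shows:
NOT() {
  ! "$@"        # succeed only when the wrapped command fails
}
NOT accel_perf -t 1 -w compress && echo 'failed as required: missing -l'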
00:05:34.625 09:28:57 -- common/autotest_common.sh@653 -- # es=234 00:05:34.625 09:28:57 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:34.625 09:28:57 -- common/autotest_common.sh@662 -- # es=106 00:05:34.625 09:28:57 -- common/autotest_common.sh@663 -- # case "$es" in 00:05:34.625 09:28:57 -- common/autotest_common.sh@670 -- # es=1 00:05:34.625 09:28:57 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:34.625 00:05:34.625 real 0m0.333s 00:05:34.625 user 0m0.234s 00:05:34.625 sys 0m0.138s 00:05:34.625 09:28:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:34.625 09:28:57 -- common/autotest_common.sh@10 -- # set +x 00:05:34.626 ************************************ 00:05:34.626 END TEST accel_missing_filename 00:05:34.626 ************************************ 00:05:34.626 09:28:57 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:34.626 09:28:57 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:05:34.626 09:28:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:34.626 09:28:57 -- common/autotest_common.sh@10 -- # set +x 00:05:34.885 ************************************ 00:05:34.885 START TEST accel_compress_verify 00:05:34.885 ************************************ 00:05:34.885 09:28:57 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:34.885 09:28:57 -- common/autotest_common.sh@650 -- # local es=0 00:05:34.885 09:28:57 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:34.885 09:28:57 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:05:34.885 09:28:57 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:34.885 09:28:57 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:05:34.885 09:28:57 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:34.885 09:28:57 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:34.885 09:28:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:05:34.885 09:28:57 -- accel/accel.sh@12 -- # build_accel_config 00:05:34.885 09:28:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:34.885 09:28:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:34.885 09:28:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:34.885 09:28:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:34.885 09:28:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:34.885 09:28:57 -- accel/accel.sh@41 -- # local IFS=, 00:05:34.885 09:28:57 -- accel/accel.sh@42 -- # jq -r . 00:05:34.885 [2024-11-29 09:28:57.497396] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:34.885 [2024-11-29 09:28:57.497491] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3166661 ] 00:05:34.885 EAL: No free 2048 kB hugepages reported on node 1 00:05:34.885 [2024-11-29 09:28:57.567186] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:34.885 [2024-11-29 09:28:57.636628] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:34.885 [2024-11-29 09:28:57.676421] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:35.145 [2024-11-29 09:28:57.736454] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:05:35.145 00:05:35.145 Compression does not support the verify option, aborting. 00:05:35.145 09:28:57 -- common/autotest_common.sh@653 -- # es=161 00:05:35.145 09:28:57 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:35.145 09:28:57 -- common/autotest_common.sh@662 -- # es=33 00:05:35.145 09:28:57 -- common/autotest_common.sh@663 -- # case "$es" in 00:05:35.145 09:28:57 -- common/autotest_common.sh@670 -- # es=1 00:05:35.145 09:28:57 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:35.145 00:05:35.145 real 0m0.328s 00:05:35.145 user 0m0.230s 00:05:35.145 sys 0m0.134s 00:05:35.145 09:28:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:35.145 09:28:57 -- common/autotest_common.sh@10 -- # set +x 00:05:35.145 ************************************ 00:05:35.145 END TEST accel_compress_verify 00:05:35.145 ************************************ 00:05:35.145 09:28:57 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:05:35.145 09:28:57 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:05:35.145 09:28:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:35.145 09:28:57 -- common/autotest_common.sh@10 -- # set +x 00:05:35.145 ************************************ 00:05:35.145 START TEST accel_wrong_workload 00:05:35.145 ************************************ 00:05:35.145 09:28:57 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w foobar 00:05:35.145 09:28:57 -- common/autotest_common.sh@650 -- # local es=0 00:05:35.145 09:28:57 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:05:35.145 09:28:57 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:05:35.145 09:28:57 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:35.145 09:28:57 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:05:35.145 09:28:57 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:35.145 09:28:57 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar 00:05:35.145 09:28:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:05:35.145 09:28:57 -- accel/accel.sh@12 -- # build_accel_config 00:05:35.145 09:28:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:35.145 09:28:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:35.145 09:28:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:35.145 09:28:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:35.145 09:28:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:35.145 09:28:57 -- accel/accel.sh@41 -- # local IFS=, 00:05:35.145 09:28:57 -- accel/accel.sh@42 -- # jq -r . 
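The [[ 0 -gt 0 ]] checks traced inside build_accel_config above are module selectors: each tests an SPDK_TEST_ACCEL_* flag and appends a scan RPC to accel_json_cfg only when the flag is set; all of them are 0 in this run, so the config stays empty and accel_perf falls back to the software module. A rough re-creation, with the flag and method names assumed rather than taken from this log:

    build_accel_config() {
        accel_json_cfg=()   # stays empty when no hardware accel module is requested
        [[ ${SPDK_TEST_ACCEL_DSA:-0} -gt 0 ]] && accel_json_cfg+=('{"method": "dsa_scan_accel_module"}')
        [[ ${SPDK_TEST_ACCEL_IAA:-0} -gt 0 ]] && accel_json_cfg+=('{"method": "iaa_scan_accel_module"}')
        # remaining module checks elided
    }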
00:05:35.145 Unsupported workload type: foobar 00:05:35.145 [2024-11-29 09:28:57.872506] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:05:35.145 accel_perf options: 00:05:35.145 [-h help message] 00:05:35.145 [-q queue depth per core] 00:05:35.145 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:05:35.145 [-T number of threads per core 00:05:35.145 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:05:35.145 [-t time in seconds] 00:05:35.145 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:05:35.145 [ dif_verify, , dif_generate, dif_generate_copy 00:05:35.145 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:05:35.145 [-l for compress/decompress workloads, name of uncompressed input file 00:05:35.145 [-S for crc32c workload, use this seed value (default 0) 00:05:35.145 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:05:35.145 [-f for fill workload, use this BYTE value (default 255) 00:05:35.145 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:05:35.145 [-y verify result if this switch is on] 00:05:35.145 [-a tasks to allocate per core (default: same value as -q)] 00:05:35.145 Can be used to spread operations across a wider range of memory. 00:05:35.145 09:28:57 -- common/autotest_common.sh@653 -- # es=1 00:05:35.145 09:28:57 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:35.145 09:28:57 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:35.145 09:28:57 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:35.145 00:05:35.145 real 0m0.028s 00:05:35.145 user 0m0.010s 00:05:35.145 sys 0m0.018s 00:05:35.145 09:28:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:35.145 09:28:57 -- common/autotest_common.sh@10 -- # set +x 00:05:35.145 ************************************ 00:05:35.145 END TEST accel_wrong_workload 00:05:35.145 ************************************ 00:05:35.145 Error: writing output failed: Broken pipe 00:05:35.145 09:28:57 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:05:35.145 09:28:57 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:05:35.145 09:28:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:35.145 09:28:57 -- common/autotest_common.sh@10 -- # set +x 00:05:35.145 ************************************ 00:05:35.145 START TEST accel_negative_buffers 00:05:35.145 ************************************ 00:05:35.145 09:28:57 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:05:35.145 09:28:57 -- common/autotest_common.sh@650 -- # local es=0 00:05:35.145 09:28:57 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:05:35.145 09:28:57 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:05:35.145 09:28:57 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:35.145 09:28:57 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:05:35.145 09:28:57 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:05:35.145 09:28:57 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1 00:05:35.145 09:28:57 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:05:35.145 09:28:57 -- accel/accel.sh@12 -- # build_accel_config 00:05:35.145 09:28:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:35.145 09:28:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:35.145 09:28:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:35.145 09:28:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:35.145 09:28:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:35.145 09:28:57 -- accel/accel.sh@41 -- # local IFS=, 00:05:35.145 09:28:57 -- accel/accel.sh@42 -- # jq -r . 00:05:35.145 -x option must be non-negative. 00:05:35.145 [2024-11-29 09:28:57.950005] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:05:35.145 accel_perf options: 00:05:35.145 [-h help message] 00:05:35.145 [-q queue depth per core] 00:05:35.145 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:05:35.145 [-T number of threads per core 00:05:35.145 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:05:35.145 [-t time in seconds] 00:05:35.145 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:05:35.145 [ dif_verify, , dif_generate, dif_generate_copy 00:05:35.145 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:05:35.145 [-l for compress/decompress workloads, name of uncompressed input file 00:05:35.145 [-S for crc32c workload, use this seed value (default 0) 00:05:35.146 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:05:35.146 [-f for fill workload, use this BYTE value (default 255) 00:05:35.146 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:05:35.146 [-y verify result if this switch is on] 00:05:35.146 [-a tasks to allocate per core (default: same value as -q)] 00:05:35.146 Can be used to spread operations across a wider range of memory. 
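The usage text above is accel_perf's response to a rejected argument: -x -1 fails because xor requires at least two source buffers. For contrast, a valid standalone invocation of the same binary (same workspace path as in the trace) would be:

    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
        -t 1 -w xor -y -x 2   # two source buffers, the documented minimum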
00:05:35.146 09:28:57 -- common/autotest_common.sh@653 -- # es=1 00:05:35.146 09:28:57 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:05:35.146 09:28:57 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:05:35.146 09:28:57 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:05:35.146 00:05:35.146 real 0m0.030s 00:05:35.146 user 0m0.013s 00:05:35.146 sys 0m0.017s 00:05:35.146 09:28:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:35.146 09:28:57 -- common/autotest_common.sh@10 -- # set +x 00:05:35.146 ************************************ 00:05:35.146 END TEST accel_negative_buffers 00:05:35.146 ************************************ 00:05:35.146 Error: writing output failed: Broken pipe 00:05:35.405 09:28:57 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:05:35.405 09:28:57 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:05:35.405 09:28:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:35.405 09:28:57 -- common/autotest_common.sh@10 -- # set +x 00:05:35.405 ************************************ 00:05:35.405 START TEST accel_crc32c 00:05:35.405 ************************************ 00:05:35.405 09:28:57 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -S 32 -y 00:05:35.405 09:28:58 -- accel/accel.sh@16 -- # local accel_opc 00:05:35.405 09:28:58 -- accel/accel.sh@17 -- # local accel_module 00:05:35.405 09:28:58 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:05:35.405 09:28:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:05:35.405 09:28:58 -- accel/accel.sh@12 -- # build_accel_config 00:05:35.405 09:28:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:35.405 09:28:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:35.405 09:28:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:35.405 09:28:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:35.405 09:28:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:35.405 09:28:58 -- accel/accel.sh@41 -- # local IFS=, 00:05:35.405 09:28:58 -- accel/accel.sh@42 -- # jq -r . 00:05:35.405 [2024-11-29 09:28:58.022505] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:35.405 [2024-11-29 09:28:58.022595] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3166729 ] 00:05:35.405 EAL: No free 2048 kB hugepages reported on node 1 00:05:35.405 [2024-11-29 09:28:58.094890] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:35.405 [2024-11-29 09:28:58.165674] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:36.784 09:28:59 -- accel/accel.sh@18 -- # out=' 00:05:36.784 SPDK Configuration: 00:05:36.784 Core mask: 0x1 00:05:36.784 00:05:36.784 Accel Perf Configuration: 00:05:36.784 Workload Type: crc32c 00:05:36.784 CRC-32C seed: 32 00:05:36.784 Transfer size: 4096 bytes 00:05:36.784 Vector count 1 00:05:36.784 Module: software 00:05:36.784 Queue depth: 32 00:05:36.784 Allocate depth: 32 00:05:36.784 # threads/core: 1 00:05:36.784 Run time: 1 seconds 00:05:36.784 Verify: Yes 00:05:36.784 00:05:36.784 Running for 1 seconds... 
00:05:36.784 00:05:36.784 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:36.784 ------------------------------------------------------------------------------------ 00:05:36.784 0,0 844288/s 3298 MiB/s 0 0 00:05:36.784 ==================================================================================== 00:05:36.784 Total 844288/s 3298 MiB/s 0 0' 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:05:36.784 09:28:59 -- accel/accel.sh@12 -- # build_accel_config 00:05:36.784 09:28:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:05:36.784 09:28:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:36.784 09:28:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:36.784 09:28:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:36.784 09:28:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:36.784 09:28:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:36.784 09:28:59 -- accel/accel.sh@41 -- # local IFS=, 00:05:36.784 09:28:59 -- accel/accel.sh@42 -- # jq -r . 00:05:36.784 [2024-11-29 09:28:59.354809] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:36.784 [2024-11-29 09:28:59.354907] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3166995 ] 00:05:36.784 EAL: No free 2048 kB hugepages reported on node 1 00:05:36.784 [2024-11-29 09:28:59.425706] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:36.784 [2024-11-29 09:28:59.494164] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val= 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val= 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val=0x1 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val= 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val= 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val=crc32c 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val=32 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 
09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val= 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val=software 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@23 -- # accel_module=software 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val=32 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val=32 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val=1 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val=Yes 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val= 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:36.784 09:28:59 -- accel/accel.sh@21 -- # val= 00:05:36.784 09:28:59 -- accel/accel.sh@22 -- # case "$var" in 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # IFS=: 00:05:36.784 09:28:59 -- accel/accel.sh@20 -- # read -r var val 00:05:38.163 09:29:00 -- accel/accel.sh@21 -- # val= 00:05:38.163 09:29:00 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.163 09:29:00 -- accel/accel.sh@20 -- # IFS=: 00:05:38.163 09:29:00 -- accel/accel.sh@20 -- # read -r var val 00:05:38.163 09:29:00 -- accel/accel.sh@21 -- # val= 00:05:38.163 09:29:00 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.163 09:29:00 -- accel/accel.sh@20 -- # IFS=: 00:05:38.163 09:29:00 -- accel/accel.sh@20 -- # read -r var val 00:05:38.163 09:29:00 -- accel/accel.sh@21 -- # val= 00:05:38.163 09:29:00 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.163 09:29:00 -- accel/accel.sh@20 -- # IFS=: 00:05:38.163 09:29:00 -- accel/accel.sh@20 -- # read -r var val 00:05:38.163 09:29:00 -- accel/accel.sh@21 -- # val= 00:05:38.163 09:29:00 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.163 09:29:00 -- accel/accel.sh@20 -- # IFS=: 00:05:38.163 09:29:00 -- accel/accel.sh@20 -- # read -r var val 00:05:38.163 09:29:00 -- accel/accel.sh@21 -- # val= 00:05:38.163 09:29:00 -- accel/accel.sh@22 -- # case "$var" in 
00:05:38.163 09:29:00 -- accel/accel.sh@20 -- # IFS=: 00:05:38.163 09:29:00 -- accel/accel.sh@20 -- # read -r var val 00:05:38.163 09:29:00 -- accel/accel.sh@21 -- # val= 00:05:38.163 09:29:00 -- accel/accel.sh@22 -- # case "$var" in 00:05:38.163 09:29:00 -- accel/accel.sh@20 -- # IFS=: 00:05:38.163 09:29:00 -- accel/accel.sh@20 -- # read -r var val 00:05:38.163 09:29:00 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:38.163 09:29:00 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:05:38.163 09:29:00 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:38.163 00:05:38.163 real 0m2.667s 00:05:38.163 user 0m2.422s 00:05:38.163 sys 0m0.251s 00:05:38.163 09:29:00 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:38.163 09:29:00 -- common/autotest_common.sh@10 -- # set +x 00:05:38.163 ************************************ 00:05:38.163 END TEST accel_crc32c 00:05:38.163 ************************************ 00:05:38.163 09:29:00 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:05:38.163 09:29:00 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:05:38.163 09:29:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:38.163 09:29:00 -- common/autotest_common.sh@10 -- # set +x 00:05:38.163 ************************************ 00:05:38.163 START TEST accel_crc32c_C2 00:05:38.163 ************************************ 00:05:38.163 09:29:00 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -y -C 2 00:05:38.163 09:29:00 -- accel/accel.sh@16 -- # local accel_opc 00:05:38.163 09:29:00 -- accel/accel.sh@17 -- # local accel_module 00:05:38.163 09:29:00 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:05:38.163 09:29:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:05:38.163 09:29:00 -- accel/accel.sh@12 -- # build_accel_config 00:05:38.163 09:29:00 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:38.163 09:29:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:38.163 09:29:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:38.163 09:29:00 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:38.163 09:29:00 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:38.163 09:29:00 -- accel/accel.sh@41 -- # local IFS=, 00:05:38.163 09:29:00 -- accel/accel.sh@42 -- # jq -r . 00:05:38.163 [2024-11-29 09:29:00.736215] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:38.163 [2024-11-29 09:29:00.736305] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3167278 ] 00:05:38.163 EAL: No free 2048 kB hugepages reported on node 1 00:05:38.163 [2024-11-29 09:29:00.805297] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:38.163 [2024-11-29 09:29:00.873761] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:39.542 09:29:02 -- accel/accel.sh@18 -- # out=' 00:05:39.542 SPDK Configuration: 00:05:39.542 Core mask: 0x1 00:05:39.542 00:05:39.542 Accel Perf Configuration: 00:05:39.542 Workload Type: crc32c 00:05:39.542 CRC-32C seed: 0 00:05:39.542 Transfer size: 4096 bytes 00:05:39.542 Vector count 2 00:05:39.542 Module: software 00:05:39.542 Queue depth: 32 00:05:39.542 Allocate depth: 32 00:05:39.543 # threads/core: 1 00:05:39.543 Run time: 1 seconds 00:05:39.543 Verify: Yes 00:05:39.543 00:05:39.543 Running for 1 seconds... 00:05:39.543 00:05:39.543 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:39.543 ------------------------------------------------------------------------------------ 00:05:39.543 0,0 612704/s 2393 MiB/s 0 0 00:05:39.543 ==================================================================================== 00:05:39.543 Total 612704/s 2393 MiB/s 0 0' 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:05:39.543 09:29:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:05:39.543 09:29:02 -- accel/accel.sh@12 -- # build_accel_config 00:05:39.543 09:29:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:39.543 09:29:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:39.543 09:29:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:39.543 09:29:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:39.543 09:29:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:39.543 09:29:02 -- accel/accel.sh@41 -- # local IFS=, 00:05:39.543 09:29:02 -- accel/accel.sh@42 -- # jq -r . 00:05:39.543 [2024-11-29 09:29:02.060499] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:05:39.543 [2024-11-29 09:29:02.060593] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3167545 ] 00:05:39.543 EAL: No free 2048 kB hugepages reported on node 1 00:05:39.543 [2024-11-29 09:29:02.130130] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:39.543 [2024-11-29 09:29:02.198256] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:39.543 09:29:02 -- accel/accel.sh@21 -- # val= 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- accel/accel.sh@21 -- # val= 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- accel/accel.sh@21 -- # val=0x1 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- accel/accel.sh@21 -- # val= 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- accel/accel.sh@21 -- # val= 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- accel/accel.sh@21 -- # val=crc32c 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- accel/accel.sh@21 -- # val=0 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- accel/accel.sh@21 -- # val= 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- accel/accel.sh@21 -- # val=software 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@23 -- # accel_module=software 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- accel/accel.sh@21 -- # val=32 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- accel/accel.sh@21 -- # val=32 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- 
accel/accel.sh@21 -- # val=1 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- accel/accel.sh@21 -- # val=Yes 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- accel/accel.sh@21 -- # val= 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:39.543 09:29:02 -- accel/accel.sh@21 -- # val= 00:05:39.543 09:29:02 -- accel/accel.sh@22 -- # case "$var" in 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # IFS=: 00:05:39.543 09:29:02 -- accel/accel.sh@20 -- # read -r var val 00:05:40.924 09:29:03 -- accel/accel.sh@21 -- # val= 00:05:40.924 09:29:03 -- accel/accel.sh@22 -- # case "$var" in 00:05:40.924 09:29:03 -- accel/accel.sh@20 -- # IFS=: 00:05:40.924 09:29:03 -- accel/accel.sh@20 -- # read -r var val 00:05:40.924 09:29:03 -- accel/accel.sh@21 -- # val= 00:05:40.924 09:29:03 -- accel/accel.sh@22 -- # case "$var" in 00:05:40.924 09:29:03 -- accel/accel.sh@20 -- # IFS=: 00:05:40.924 09:29:03 -- accel/accel.sh@20 -- # read -r var val 00:05:40.924 09:29:03 -- accel/accel.sh@21 -- # val= 00:05:40.924 09:29:03 -- accel/accel.sh@22 -- # case "$var" in 00:05:40.924 09:29:03 -- accel/accel.sh@20 -- # IFS=: 00:05:40.924 09:29:03 -- accel/accel.sh@20 -- # read -r var val 00:05:40.924 09:29:03 -- accel/accel.sh@21 -- # val= 00:05:40.924 09:29:03 -- accel/accel.sh@22 -- # case "$var" in 00:05:40.924 09:29:03 -- accel/accel.sh@20 -- # IFS=: 00:05:40.924 09:29:03 -- accel/accel.sh@20 -- # read -r var val 00:05:40.924 09:29:03 -- accel/accel.sh@21 -- # val= 00:05:40.924 09:29:03 -- accel/accel.sh@22 -- # case "$var" in 00:05:40.924 09:29:03 -- accel/accel.sh@20 -- # IFS=: 00:05:40.924 09:29:03 -- accel/accel.sh@20 -- # read -r var val 00:05:40.924 09:29:03 -- accel/accel.sh@21 -- # val= 00:05:40.924 09:29:03 -- accel/accel.sh@22 -- # case "$var" in 00:05:40.924 09:29:03 -- accel/accel.sh@20 -- # IFS=: 00:05:40.924 09:29:03 -- accel/accel.sh@20 -- # read -r var val 00:05:40.924 09:29:03 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:40.924 09:29:03 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:05:40.924 09:29:03 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:40.924 00:05:40.924 real 0m2.656s 00:05:40.924 user 0m2.406s 00:05:40.924 sys 0m0.259s 00:05:40.924 09:29:03 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:40.924 09:29:03 -- common/autotest_common.sh@10 -- # set +x 00:05:40.924 ************************************ 00:05:40.924 END TEST accel_crc32c_C2 00:05:40.924 ************************************ 00:05:40.924 09:29:03 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:05:40.924 09:29:03 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:05:40.924 09:29:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:40.924 09:29:03 -- common/autotest_common.sh@10 -- # set +x 00:05:40.924 ************************************ 00:05:40.924 START TEST accel_copy 
00:05:40.924 ************************************ 00:05:40.924 09:29:03 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy -y 00:05:40.924 09:29:03 -- accel/accel.sh@16 -- # local accel_opc 00:05:40.924 09:29:03 -- accel/accel.sh@17 -- # local accel_module 00:05:40.924 09:29:03 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:05:40.924 09:29:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:05:40.924 09:29:03 -- accel/accel.sh@12 -- # build_accel_config 00:05:40.924 09:29:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:40.924 09:29:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:40.924 09:29:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:40.924 09:29:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:40.924 09:29:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:40.924 09:29:03 -- accel/accel.sh@41 -- # local IFS=, 00:05:40.924 09:29:03 -- accel/accel.sh@42 -- # jq -r . 00:05:40.924 [2024-11-29 09:29:03.442864] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:40.924 [2024-11-29 09:29:03.442971] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3167834 ] 00:05:40.924 EAL: No free 2048 kB hugepages reported on node 1 00:05:40.924 [2024-11-29 09:29:03.512646] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:40.924 [2024-11-29 09:29:03.582009] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:42.304 09:29:04 -- accel/accel.sh@18 -- # out=' 00:05:42.304 SPDK Configuration: 00:05:42.304 Core mask: 0x1 00:05:42.304 00:05:42.304 Accel Perf Configuration: 00:05:42.304 Workload Type: copy 00:05:42.304 Transfer size: 4096 bytes 00:05:42.304 Vector count 1 00:05:42.304 Module: software 00:05:42.304 Queue depth: 32 00:05:42.304 Allocate depth: 32 00:05:42.304 # threads/core: 1 00:05:42.304 Run time: 1 seconds 00:05:42.304 Verify: Yes 00:05:42.304 00:05:42.304 Running for 1 seconds... 00:05:42.304 00:05:42.304 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:42.304 ------------------------------------------------------------------------------------ 00:05:42.304 0,0 543936/s 2124 MiB/s 0 0 00:05:42.304 ==================================================================================== 00:05:42.304 Total 543936/s 2124 MiB/s 0 0' 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:42.304 09:29:04 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:05:42.304 09:29:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:05:42.304 09:29:04 -- accel/accel.sh@12 -- # build_accel_config 00:05:42.304 09:29:04 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:42.304 09:29:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:42.304 09:29:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:42.304 09:29:04 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:42.304 09:29:04 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:42.304 09:29:04 -- accel/accel.sh@41 -- # local IFS=, 00:05:42.304 09:29:04 -- accel/accel.sh@42 -- # jq -r . 00:05:42.304 [2024-11-29 09:29:04.771770] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:42.304 [2024-11-29 09:29:04.771863] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3168101 ] 00:05:42.304 EAL: No free 2048 kB hugepages reported on node 1 00:05:42.304 [2024-11-29 09:29:04.842589] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:42.304 [2024-11-29 09:29:04.911523] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:42.304 09:29:04 -- accel/accel.sh@21 -- # val= 00:05:42.304 09:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:42.304 09:29:04 -- accel/accel.sh@21 -- # val= 00:05:42.304 09:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:42.304 09:29:04 -- accel/accel.sh@21 -- # val=0x1 00:05:42.304 09:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:42.304 09:29:04 -- accel/accel.sh@21 -- # val= 00:05:42.304 09:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:42.304 09:29:04 -- accel/accel.sh@21 -- # val= 00:05:42.304 09:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:42.304 09:29:04 -- accel/accel.sh@21 -- # val=copy 00:05:42.304 09:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.304 09:29:04 -- accel/accel.sh@24 -- # accel_opc=copy 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:42.304 09:29:04 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:42.304 09:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:42.304 09:29:04 -- accel/accel.sh@21 -- # val= 00:05:42.304 09:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:42.304 09:29:04 -- accel/accel.sh@21 -- # val=software 00:05:42.304 09:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.304 09:29:04 -- accel/accel.sh@23 -- # accel_module=software 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:42.304 09:29:04 -- accel/accel.sh@21 -- # val=32 00:05:42.304 09:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:42.304 09:29:04 -- accel/accel.sh@21 -- # val=32 00:05:42.304 09:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:42.304 09:29:04 -- accel/accel.sh@21 -- # val=1 00:05:42.304 09:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:42.304 09:29:04 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:05:42.304 09:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:42.304 09:29:04 -- accel/accel.sh@21 -- # val=Yes 00:05:42.304 09:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:42.304 09:29:04 -- accel/accel.sh@21 -- # val= 00:05:42.304 09:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:42.304 09:29:04 -- accel/accel.sh@21 -- # val= 00:05:42.304 09:29:04 -- accel/accel.sh@22 -- # case "$var" in 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # IFS=: 00:05:42.304 09:29:04 -- accel/accel.sh@20 -- # read -r var val 00:05:43.240 09:29:06 -- accel/accel.sh@21 -- # val= 00:05:43.240 09:29:06 -- accel/accel.sh@22 -- # case "$var" in 00:05:43.240 09:29:06 -- accel/accel.sh@20 -- # IFS=: 00:05:43.240 09:29:06 -- accel/accel.sh@20 -- # read -r var val 00:05:43.240 09:29:06 -- accel/accel.sh@21 -- # val= 00:05:43.240 09:29:06 -- accel/accel.sh@22 -- # case "$var" in 00:05:43.240 09:29:06 -- accel/accel.sh@20 -- # IFS=: 00:05:43.240 09:29:06 -- accel/accel.sh@20 -- # read -r var val 00:05:43.240 09:29:06 -- accel/accel.sh@21 -- # val= 00:05:43.240 09:29:06 -- accel/accel.sh@22 -- # case "$var" in 00:05:43.240 09:29:06 -- accel/accel.sh@20 -- # IFS=: 00:05:43.499 09:29:06 -- accel/accel.sh@20 -- # read -r var val 00:05:43.499 09:29:06 -- accel/accel.sh@21 -- # val= 00:05:43.499 09:29:06 -- accel/accel.sh@22 -- # case "$var" in 00:05:43.499 09:29:06 -- accel/accel.sh@20 -- # IFS=: 00:05:43.499 09:29:06 -- accel/accel.sh@20 -- # read -r var val 00:05:43.499 09:29:06 -- accel/accel.sh@21 -- # val= 00:05:43.500 09:29:06 -- accel/accel.sh@22 -- # case "$var" in 00:05:43.500 09:29:06 -- accel/accel.sh@20 -- # IFS=: 00:05:43.500 09:29:06 -- accel/accel.sh@20 -- # read -r var val 00:05:43.500 09:29:06 -- accel/accel.sh@21 -- # val= 00:05:43.500 09:29:06 -- accel/accel.sh@22 -- # case "$var" in 00:05:43.500 09:29:06 -- accel/accel.sh@20 -- # IFS=: 00:05:43.500 09:29:06 -- accel/accel.sh@20 -- # read -r var val 00:05:43.500 09:29:06 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:43.500 09:29:06 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:05:43.500 09:29:06 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:43.500 00:05:43.500 real 0m2.666s 00:05:43.500 user 0m2.420s 00:05:43.500 sys 0m0.254s 00:05:43.500 09:29:06 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:43.500 09:29:06 -- common/autotest_common.sh@10 -- # set +x 00:05:43.500 ************************************ 00:05:43.500 END TEST accel_copy 00:05:43.500 ************************************ 00:05:43.500 09:29:06 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:43.500 09:29:06 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:05:43.500 09:29:06 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:43.500 09:29:06 -- common/autotest_common.sh@10 -- # set +x 00:05:43.500 ************************************ 00:05:43.500 START TEST accel_fill 00:05:43.500 ************************************ 00:05:43.500 09:29:06 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:43.500 09:29:06 -- accel/accel.sh@16 -- # local accel_opc 
00:05:43.500 09:29:06 -- accel/accel.sh@17 -- # local accel_module 00:05:43.500 09:29:06 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:43.500 09:29:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:43.500 09:29:06 -- accel/accel.sh@12 -- # build_accel_config 00:05:43.500 09:29:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:43.500 09:29:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:43.500 09:29:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:43.500 09:29:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:43.500 09:29:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:43.500 09:29:06 -- accel/accel.sh@41 -- # local IFS=, 00:05:43.500 09:29:06 -- accel/accel.sh@42 -- # jq -r . 00:05:43.500 [2024-11-29 09:29:06.157335] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:43.500 [2024-11-29 09:29:06.157424] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3168376 ] 00:05:43.500 EAL: No free 2048 kB hugepages reported on node 1 00:05:43.500 [2024-11-29 09:29:06.229014] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:43.500 [2024-11-29 09:29:06.298618] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:44.880 09:29:07 -- accel/accel.sh@18 -- # out=' 00:05:44.880 SPDK Configuration: 00:05:44.880 Core mask: 0x1 00:05:44.880 00:05:44.880 Accel Perf Configuration: 00:05:44.880 Workload Type: fill 00:05:44.880 Fill pattern: 0x80 00:05:44.880 Transfer size: 4096 bytes 00:05:44.880 Vector count 1 00:05:44.880 Module: software 00:05:44.880 Queue depth: 64 00:05:44.880 Allocate depth: 64 00:05:44.880 # threads/core: 1 00:05:44.880 Run time: 1 seconds 00:05:44.880 Verify: Yes 00:05:44.880 00:05:44.880 Running for 1 seconds... 00:05:44.880 00:05:44.880 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:44.880 ------------------------------------------------------------------------------------ 00:05:44.880 0,0 970112/s 3789 MiB/s 0 0 00:05:44.880 ==================================================================================== 00:05:44.880 Total 970112/s 3789 MiB/s 0 0' 00:05:44.880 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.880 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.880 09:29:07 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:44.880 09:29:07 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:05:44.880 09:29:07 -- accel/accel.sh@12 -- # build_accel_config 00:05:44.880 09:29:07 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:44.880 09:29:07 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:44.880 09:29:07 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:44.880 09:29:07 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:44.881 09:29:07 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:44.881 09:29:07 -- accel/accel.sh@41 -- # local IFS=, 00:05:44.881 09:29:07 -- accel/accel.sh@42 -- # jq -r . 00:05:44.881 [2024-11-29 09:29:07.490095] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:44.881 [2024-11-29 09:29:07.490187] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3168546 ] 00:05:44.881 EAL: No free 2048 kB hugepages reported on node 1 00:05:44.881 [2024-11-29 09:29:07.562257] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:44.881 [2024-11-29 09:29:07.631240] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:44.881 09:29:07 -- accel/accel.sh@21 -- # val= 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.881 09:29:07 -- accel/accel.sh@21 -- # val= 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.881 09:29:07 -- accel/accel.sh@21 -- # val=0x1 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.881 09:29:07 -- accel/accel.sh@21 -- # val= 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.881 09:29:07 -- accel/accel.sh@21 -- # val= 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.881 09:29:07 -- accel/accel.sh@21 -- # val=fill 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@24 -- # accel_opc=fill 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.881 09:29:07 -- accel/accel.sh@21 -- # val=0x80 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.881 09:29:07 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.881 09:29:07 -- accel/accel.sh@21 -- # val= 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.881 09:29:07 -- accel/accel.sh@21 -- # val=software 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@23 -- # accel_module=software 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.881 09:29:07 -- accel/accel.sh@21 -- # val=64 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.881 09:29:07 -- accel/accel.sh@21 -- # val=64 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.881 09:29:07 -- 
accel/accel.sh@21 -- # val=1 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.881 09:29:07 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.881 09:29:07 -- accel/accel.sh@21 -- # val=Yes 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.881 09:29:07 -- accel/accel.sh@21 -- # val= 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:44.881 09:29:07 -- accel/accel.sh@21 -- # val= 00:05:44.881 09:29:07 -- accel/accel.sh@22 -- # case "$var" in 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # IFS=: 00:05:44.881 09:29:07 -- accel/accel.sh@20 -- # read -r var val 00:05:46.456 09:29:08 -- accel/accel.sh@21 -- # val= 00:05:46.456 09:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.456 09:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:46.456 09:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:46.456 09:29:08 -- accel/accel.sh@21 -- # val= 00:05:46.456 09:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.456 09:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:46.456 09:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:46.456 09:29:08 -- accel/accel.sh@21 -- # val= 00:05:46.456 09:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.456 09:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:46.456 09:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:46.456 09:29:08 -- accel/accel.sh@21 -- # val= 00:05:46.456 09:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.456 09:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:46.456 09:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:46.456 09:29:08 -- accel/accel.sh@21 -- # val= 00:05:46.456 09:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.456 09:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:46.456 09:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:46.456 09:29:08 -- accel/accel.sh@21 -- # val= 00:05:46.456 09:29:08 -- accel/accel.sh@22 -- # case "$var" in 00:05:46.456 09:29:08 -- accel/accel.sh@20 -- # IFS=: 00:05:46.456 09:29:08 -- accel/accel.sh@20 -- # read -r var val 00:05:46.456 09:29:08 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:46.456 09:29:08 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:05:46.456 09:29:08 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:46.456 00:05:46.456 real 0m2.671s 00:05:46.456 user 0m2.423s 00:05:46.456 sys 0m0.258s 00:05:46.456 09:29:08 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:46.456 09:29:08 -- common/autotest_common.sh@10 -- # set +x 00:05:46.456 ************************************ 00:05:46.456 END TEST accel_fill 00:05:46.456 ************************************ 00:05:46.456 09:29:08 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:05:46.456 09:29:08 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:05:46.456 09:29:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:46.456 09:29:08 -- common/autotest_common.sh@10 -- # set +x 00:05:46.456 ************************************ 00:05:46.456 START TEST 
accel_copy_crc32c 00:05:46.456 ************************************ 00:05:46.456 09:29:08 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y 00:05:46.456 09:29:08 -- accel/accel.sh@16 -- # local accel_opc 00:05:46.456 09:29:08 -- accel/accel.sh@17 -- # local accel_module 00:05:46.456 09:29:08 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:05:46.456 09:29:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:05:46.456 09:29:08 -- accel/accel.sh@12 -- # build_accel_config 00:05:46.456 09:29:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:46.456 09:29:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:46.456 09:29:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:46.456 09:29:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:46.456 09:29:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:46.456 09:29:08 -- accel/accel.sh@41 -- # local IFS=, 00:05:46.456 09:29:08 -- accel/accel.sh@42 -- # jq -r . 00:05:46.456 [2024-11-29 09:29:08.866873] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:46.456 [2024-11-29 09:29:08.866939] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3168751 ] 00:05:46.456 EAL: No free 2048 kB hugepages reported on node 1 00:05:46.456 [2024-11-29 09:29:08.928879] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:46.456 [2024-11-29 09:29:08.999457] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:47.395 09:29:10 -- accel/accel.sh@18 -- # out=' 00:05:47.395 SPDK Configuration: 00:05:47.395 Core mask: 0x1 00:05:47.395 00:05:47.395 Accel Perf Configuration: 00:05:47.395 Workload Type: copy_crc32c 00:05:47.395 CRC-32C seed: 0 00:05:47.395 Vector size: 4096 bytes 00:05:47.395 Transfer size: 4096 bytes 00:05:47.395 Vector count 1 00:05:47.395 Module: software 00:05:47.395 Queue depth: 32 00:05:47.395 Allocate depth: 32 00:05:47.395 # threads/core: 1 00:05:47.395 Run time: 1 seconds 00:05:47.395 Verify: Yes 00:05:47.395 00:05:47.395 Running for 1 seconds... 00:05:47.395 00:05:47.395 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:47.395 ------------------------------------------------------------------------------------ 00:05:47.395 0,0 433408/s 1693 MiB/s 0 0 00:05:47.395 ==================================================================================== 00:05:47.395 Total 433408/s 1693 MiB/s 0 0' 00:05:47.395 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.395 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.395 09:29:10 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:05:47.395 09:29:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:05:47.395 09:29:10 -- accel/accel.sh@12 -- # build_accel_config 00:05:47.395 09:29:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:47.395 09:29:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:47.395 09:29:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:47.395 09:29:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:47.395 09:29:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:47.395 09:29:10 -- accel/accel.sh@41 -- # local IFS=, 00:05:47.395 09:29:10 -- accel/accel.sh@42 -- # jq -r . 
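The jq -r . at the end of each build_accel_config trace pairs with the -c /dev/fd/62 argument on every accel_perf command line: the assembled JSON is pretty-printed by jq and delivered to the app on file descriptor 62, so the config never touches disk. A sketch of the mechanism, with an assumed minimal config shape:

    json='{"subsystems": [{"subsystem": "accel", "config": []}]}'
    jq -r . <<< "$json"   # validate/pretty-print, as in the trace above
    ./build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 62<<< "$json"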
00:05:47.395 [2024-11-29 09:29:10.192407] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:47.395 [2024-11-29 09:29:10.192500] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3168969 ] 00:05:47.395 EAL: No free 2048 kB hugepages reported on node 1 00:05:47.654 [2024-11-29 09:29:10.264695] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:47.654 [2024-11-29 09:29:10.336065] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:47.654 09:29:10 -- accel/accel.sh@21 -- # val= 00:05:47.654 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.654 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.654 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.654 09:29:10 -- accel/accel.sh@21 -- # val= 00:05:47.654 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.654 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.654 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.654 09:29:10 -- accel/accel.sh@21 -- # val=0x1 00:05:47.654 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.655 09:29:10 -- accel/accel.sh@21 -- # val= 00:05:47.655 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.655 09:29:10 -- accel/accel.sh@21 -- # val= 00:05:47.655 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.655 09:29:10 -- accel/accel.sh@21 -- # val=copy_crc32c 00:05:47.655 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.655 09:29:10 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.655 09:29:10 -- accel/accel.sh@21 -- # val=0 00:05:47.655 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.655 09:29:10 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:47.655 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.655 09:29:10 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:47.655 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.655 09:29:10 -- accel/accel.sh@21 -- # val= 00:05:47.655 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.655 09:29:10 -- accel/accel.sh@21 -- # val=software 00:05:47.655 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.655 09:29:10 -- accel/accel.sh@23 -- # accel_module=software 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.655 09:29:10 -- accel/accel.sh@21 -- # val=32 00:05:47.655 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 
00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.655 09:29:10 -- accel/accel.sh@21 -- # val=32 00:05:47.655 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.655 09:29:10 -- accel/accel.sh@21 -- # val=1 00:05:47.655 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.655 09:29:10 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:47.655 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.655 09:29:10 -- accel/accel.sh@21 -- # val=Yes 00:05:47.655 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.655 09:29:10 -- accel/accel.sh@21 -- # val= 00:05:47.655 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:47.655 09:29:10 -- accel/accel.sh@21 -- # val= 00:05:47.655 09:29:10 -- accel/accel.sh@22 -- # case "$var" in 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # IFS=: 00:05:47.655 09:29:10 -- accel/accel.sh@20 -- # read -r var val 00:05:49.031 09:29:11 -- accel/accel.sh@21 -- # val= 00:05:49.031 09:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.031 09:29:11 -- accel/accel.sh@20 -- # IFS=: 00:05:49.031 09:29:11 -- accel/accel.sh@20 -- # read -r var val 00:05:49.031 09:29:11 -- accel/accel.sh@21 -- # val= 00:05:49.031 09:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.031 09:29:11 -- accel/accel.sh@20 -- # IFS=: 00:05:49.031 09:29:11 -- accel/accel.sh@20 -- # read -r var val 00:05:49.031 09:29:11 -- accel/accel.sh@21 -- # val= 00:05:49.031 09:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.031 09:29:11 -- accel/accel.sh@20 -- # IFS=: 00:05:49.031 09:29:11 -- accel/accel.sh@20 -- # read -r var val 00:05:49.031 09:29:11 -- accel/accel.sh@21 -- # val= 00:05:49.031 09:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.031 09:29:11 -- accel/accel.sh@20 -- # IFS=: 00:05:49.031 09:29:11 -- accel/accel.sh@20 -- # read -r var val 00:05:49.031 09:29:11 -- accel/accel.sh@21 -- # val= 00:05:49.031 09:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.031 09:29:11 -- accel/accel.sh@20 -- # IFS=: 00:05:49.031 09:29:11 -- accel/accel.sh@20 -- # read -r var val 00:05:49.031 09:29:11 -- accel/accel.sh@21 -- # val= 00:05:49.031 09:29:11 -- accel/accel.sh@22 -- # case "$var" in 00:05:49.031 09:29:11 -- accel/accel.sh@20 -- # IFS=: 00:05:49.031 09:29:11 -- accel/accel.sh@20 -- # read -r var val 00:05:49.031 09:29:11 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:49.031 09:29:11 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:05:49.031 09:29:11 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:49.031 00:05:49.031 real 0m2.656s 00:05:49.031 user 0m2.415s 00:05:49.031 sys 0m0.252s 00:05:49.031 09:29:11 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:49.031 09:29:11 -- common/autotest_common.sh@10 -- # set +x 00:05:49.031 ************************************ 00:05:49.031 END TEST accel_copy_crc32c 00:05:49.031 ************************************ 00:05:49.031 
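For reference, the binary driven by accel.sh above can also be run by hand to repeat this workload. A minimal sketch, assuming an SPDK tree built under ./spdk; note the harness itself feeds its accel config as JSON on /dev/fd/62, and the -q/-o values below are assumptions that simply mirror the 'Queue depth: 32' and 'Transfer size: 4096 bytes' lines the tool printed, not flags taken from this log:
# hypothetical manual rerun of the short copy_crc32c pass
./spdk/build/examples/accel_perf -t 1 -w copy_crc32c -y -q 32 -o 4096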
09:29:11 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:05:49.031 09:29:11 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:05:49.031 09:29:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:49.031 09:29:11 -- common/autotest_common.sh@10 -- # set +x 00:05:49.031 ************************************ 00:05:49.031 START TEST accel_copy_crc32c_C2 00:05:49.031 ************************************ 00:05:49.031 09:29:11 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:05:49.031 09:29:11 -- accel/accel.sh@16 -- # local accel_opc 00:05:49.031 09:29:11 -- accel/accel.sh@17 -- # local accel_module 00:05:49.031 09:29:11 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:05:49.031 09:29:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:05:49.031 09:29:11 -- accel/accel.sh@12 -- # build_accel_config 00:05:49.031 09:29:11 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:49.031 09:29:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:49.031 09:29:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:49.031 09:29:11 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:49.031 09:29:11 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:49.031 09:29:11 -- accel/accel.sh@41 -- # local IFS=, 00:05:49.031 09:29:11 -- accel/accel.sh@42 -- # jq -r . 00:05:49.031 [2024-11-29 09:29:11.580827] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:49.031 [2024-11-29 09:29:11.580917] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3169256 ] 00:05:49.031 EAL: No free 2048 kB hugepages reported on node 1 00:05:49.031 [2024-11-29 09:29:11.645639] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:49.031 [2024-11-29 09:29:11.715933] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:50.407 09:29:12 -- accel/accel.sh@18 -- # out=' 00:05:50.407 SPDK Configuration: 00:05:50.407 Core mask: 0x1 00:05:50.407 00:05:50.407 Accel Perf Configuration: 00:05:50.407 Workload Type: copy_crc32c 00:05:50.407 CRC-32C seed: 0 00:05:50.407 Vector size: 4096 bytes 00:05:50.407 Transfer size: 8192 bytes 00:05:50.407 Vector count 2 00:05:50.407 Module: software 00:05:50.407 Queue depth: 32 00:05:50.407 Allocate depth: 32 00:05:50.407 # threads/core: 1 00:05:50.407 Run time: 1 seconds 00:05:50.407 Verify: Yes 00:05:50.407 00:05:50.407 Running for 1 seconds... 
00:05:50.407 00:05:50.407 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:50.407 ------------------------------------------------------------------------------------ 00:05:50.407 0,0 300448/s 2347 MiB/s 0 0 00:05:50.407 ==================================================================================== 00:05:50.407 Total 300448/s 1173 MiB/s 0 0' 00:05:50.407 09:29:12 -- accel/accel.sh@20 -- # IFS=: 00:05:50.407 09:29:12 -- accel/accel.sh@20 -- # read -r var val 00:05:50.407 09:29:12 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:05:50.407 09:29:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:05:50.407 09:29:12 -- accel/accel.sh@12 -- # build_accel_config 00:05:50.407 09:29:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:50.407 09:29:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:50.407 09:29:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:50.407 09:29:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:50.407 09:29:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:50.407 09:29:12 -- accel/accel.sh@41 -- # local IFS=, 00:05:50.407 09:29:12 -- accel/accel.sh@42 -- # jq -r . 00:05:50.407 [2024-11-29 09:29:12.905018] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:50.407 [2024-11-29 09:29:12.905110] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3169529 ] 00:05:50.407 EAL: No free 2048 kB hugepages reported on node 1 00:05:50.407 [2024-11-29 09:29:12.975217] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:50.407 [2024-11-29 09:29:13.043160] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:50.407 09:29:13 -- accel/accel.sh@21 -- # val= 00:05:50.407 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.407 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.407 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.407 09:29:13 -- accel/accel.sh@21 -- # val= 00:05:50.407 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.407 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.407 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.407 09:29:13 -- accel/accel.sh@21 -- # val=0x1 00:05:50.407 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.407 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.407 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.407 09:29:13 -- accel/accel.sh@21 -- # val= 00:05:50.407 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.407 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.407 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.407 09:29:13 -- accel/accel.sh@21 -- # val= 00:05:50.407 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.407 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.407 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.407 09:29:13 -- accel/accel.sh@21 -- # val=copy_crc32c 00:05:50.408 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.408 09:29:13 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.408 09:29:13 -- accel/accel.sh@21 -- # val=0 00:05:50.408 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # 
IFS=: 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.408 09:29:13 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:50.408 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.408 09:29:13 -- accel/accel.sh@21 -- # val='8192 bytes' 00:05:50.408 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.408 09:29:13 -- accel/accel.sh@21 -- # val= 00:05:50.408 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.408 09:29:13 -- accel/accel.sh@21 -- # val=software 00:05:50.408 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.408 09:29:13 -- accel/accel.sh@23 -- # accel_module=software 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.408 09:29:13 -- accel/accel.sh@21 -- # val=32 00:05:50.408 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.408 09:29:13 -- accel/accel.sh@21 -- # val=32 00:05:50.408 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.408 09:29:13 -- accel/accel.sh@21 -- # val=1 00:05:50.408 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.408 09:29:13 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:50.408 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.408 09:29:13 -- accel/accel.sh@21 -- # val=Yes 00:05:50.408 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.408 09:29:13 -- accel/accel.sh@21 -- # val= 00:05:50.408 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:50.408 09:29:13 -- accel/accel.sh@21 -- # val= 00:05:50.408 09:29:13 -- accel/accel.sh@22 -- # case "$var" in 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # IFS=: 00:05:50.408 09:29:13 -- accel/accel.sh@20 -- # read -r var val 00:05:51.787 09:29:14 -- accel/accel.sh@21 -- # val= 00:05:51.787 09:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:05:51.787 09:29:14 -- accel/accel.sh@20 -- # IFS=: 00:05:51.787 09:29:14 -- accel/accel.sh@20 -- # read -r var val 00:05:51.787 09:29:14 -- accel/accel.sh@21 -- # val= 00:05:51.787 09:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:05:51.787 09:29:14 -- accel/accel.sh@20 -- # IFS=: 00:05:51.787 09:29:14 -- accel/accel.sh@20 -- # read -r var val 00:05:51.787 09:29:14 -- accel/accel.sh@21 -- # val= 00:05:51.787 09:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:05:51.787 09:29:14 -- accel/accel.sh@20 -- # IFS=: 00:05:51.787 09:29:14 -- accel/accel.sh@20 -- # read -r var val 00:05:51.787 09:29:14 -- accel/accel.sh@21 -- # val= 00:05:51.787 09:29:14 -- 
accel/accel.sh@22 -- # case "$var" in 00:05:51.787 09:29:14 -- accel/accel.sh@20 -- # IFS=: 00:05:51.787 09:29:14 -- accel/accel.sh@20 -- # read -r var val 00:05:51.787 09:29:14 -- accel/accel.sh@21 -- # val= 00:05:51.787 09:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:05:51.787 09:29:14 -- accel/accel.sh@20 -- # IFS=: 00:05:51.787 09:29:14 -- accel/accel.sh@20 -- # read -r var val 00:05:51.787 09:29:14 -- accel/accel.sh@21 -- # val= 00:05:51.787 09:29:14 -- accel/accel.sh@22 -- # case "$var" in 00:05:51.787 09:29:14 -- accel/accel.sh@20 -- # IFS=: 00:05:51.787 09:29:14 -- accel/accel.sh@20 -- # read -r var val 00:05:51.787 09:29:14 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:51.787 09:29:14 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:05:51.787 09:29:14 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:51.787 00:05:51.787 real 0m2.658s 00:05:51.787 user 0m2.410s 00:05:51.787 sys 0m0.255s 00:05:51.787 09:29:14 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:51.787 09:29:14 -- common/autotest_common.sh@10 -- # set +x 00:05:51.787 ************************************ 00:05:51.787 END TEST accel_copy_crc32c_C2 00:05:51.787 ************************************ 00:05:51.787 09:29:14 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:05:51.787 09:29:14 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:05:51.787 09:29:14 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:51.787 09:29:14 -- common/autotest_common.sh@10 -- # set +x 00:05:51.787 ************************************ 00:05:51.787 START TEST accel_dualcast 00:05:51.787 ************************************ 00:05:51.787 09:29:14 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dualcast -y 00:05:51.787 09:29:14 -- accel/accel.sh@16 -- # local accel_opc 00:05:51.787 09:29:14 -- accel/accel.sh@17 -- # local accel_module 00:05:51.787 09:29:14 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:05:51.787 09:29:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:05:51.787 09:29:14 -- accel/accel.sh@12 -- # build_accel_config 00:05:51.787 09:29:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:51.787 09:29:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:51.787 09:29:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:51.787 09:29:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:51.787 09:29:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:51.787 09:29:14 -- accel/accel.sh@41 -- # local IFS=, 00:05:51.787 09:29:14 -- accel/accel.sh@42 -- # jq -r . 00:05:51.787 [2024-11-29 09:29:14.285359] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:05:51.787 [2024-11-29 09:29:14.285452] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3169810 ] 00:05:51.787 EAL: No free 2048 kB hugepages reported on node 1 00:05:51.787 [2024-11-29 09:29:14.354953] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:51.787 [2024-11-29 09:29:14.424276] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.168 09:29:15 -- accel/accel.sh@18 -- # out=' 00:05:53.168 SPDK Configuration: 00:05:53.168 Core mask: 0x1 00:05:53.168 00:05:53.168 Accel Perf Configuration: 00:05:53.168 Workload Type: dualcast 00:05:53.168 Transfer size: 4096 bytes 00:05:53.168 Vector count 1 00:05:53.168 Module: software 00:05:53.168 Queue depth: 32 00:05:53.168 Allocate depth: 32 00:05:53.168 # threads/core: 1 00:05:53.168 Run time: 1 seconds 00:05:53.168 Verify: Yes 00:05:53.168 00:05:53.168 Running for 1 seconds... 00:05:53.168 00:05:53.168 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:53.168 ------------------------------------------------------------------------------------ 00:05:53.168 0,0 620832/s 2425 MiB/s 0 0 00:05:53.168 ==================================================================================== 00:05:53.168 Total 620832/s 2425 MiB/s 0 0' 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:53.168 09:29:15 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:05:53.168 09:29:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:05:53.168 09:29:15 -- accel/accel.sh@12 -- # build_accel_config 00:05:53.168 09:29:15 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:53.168 09:29:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:53.168 09:29:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:53.168 09:29:15 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:53.168 09:29:15 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:53.168 09:29:15 -- accel/accel.sh@41 -- # local IFS=, 00:05:53.168 09:29:15 -- accel/accel.sh@42 -- # jq -r . 00:05:53.168 [2024-11-29 09:29:15.614447] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
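One point worth flagging in the copy_crc32c -C 2 table further up: the per-core row shows 2347 MiB/s while the Total row shows 1173 MiB/s for the same 300448 transfers/s. The figures are consistent with the per-core row counting both 4096-byte source vectors of each 8192-byte transfer and the Total row counting the 4096-byte payload once:
echo $(( 300448 * 8192 / 1024 / 1024 ))    # 2347 (both vectors counted)
echo $(( 300448 * 4096 / 1024 / 1024 ))    # 1173 (payload only)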
00:05:53.168 [2024-11-29 09:29:15.614542] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3170084 ] 00:05:53.168 EAL: No free 2048 kB hugepages reported on node 1 00:05:53.168 [2024-11-29 09:29:15.684557] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:53.168 [2024-11-29 09:29:15.752268] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.168 09:29:15 -- accel/accel.sh@21 -- # val= 00:05:53.168 09:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:53.168 09:29:15 -- accel/accel.sh@21 -- # val= 00:05:53.168 09:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:53.168 09:29:15 -- accel/accel.sh@21 -- # val=0x1 00:05:53.168 09:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:53.168 09:29:15 -- accel/accel.sh@21 -- # val= 00:05:53.168 09:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:53.168 09:29:15 -- accel/accel.sh@21 -- # val= 00:05:53.168 09:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:53.168 09:29:15 -- accel/accel.sh@21 -- # val=dualcast 00:05:53.168 09:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.168 09:29:15 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:53.168 09:29:15 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:53.168 09:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:53.168 09:29:15 -- accel/accel.sh@21 -- # val= 00:05:53.168 09:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:53.168 09:29:15 -- accel/accel.sh@21 -- # val=software 00:05:53.168 09:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.168 09:29:15 -- accel/accel.sh@23 -- # accel_module=software 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:53.168 09:29:15 -- accel/accel.sh@21 -- # val=32 00:05:53.168 09:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:53.168 09:29:15 -- accel/accel.sh@21 -- # val=32 00:05:53.168 09:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:53.168 09:29:15 -- accel/accel.sh@21 -- # val=1 00:05:53.168 09:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:53.168 09:29:15 
-- accel/accel.sh@21 -- # val='1 seconds' 00:05:53.168 09:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:53.168 09:29:15 -- accel/accel.sh@21 -- # val=Yes 00:05:53.168 09:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:53.168 09:29:15 -- accel/accel.sh@21 -- # val= 00:05:53.168 09:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:53.168 09:29:15 -- accel/accel.sh@21 -- # val= 00:05:53.168 09:29:15 -- accel/accel.sh@22 -- # case "$var" in 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # IFS=: 00:05:53.168 09:29:15 -- accel/accel.sh@20 -- # read -r var val 00:05:54.106 09:29:16 -- accel/accel.sh@21 -- # val= 00:05:54.106 09:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.106 09:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:54.107 09:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:54.107 09:29:16 -- accel/accel.sh@21 -- # val= 00:05:54.107 09:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.107 09:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:54.107 09:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:54.107 09:29:16 -- accel/accel.sh@21 -- # val= 00:05:54.107 09:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.107 09:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:54.107 09:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:54.107 09:29:16 -- accel/accel.sh@21 -- # val= 00:05:54.107 09:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.107 09:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:54.107 09:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:54.107 09:29:16 -- accel/accel.sh@21 -- # val= 00:05:54.107 09:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.107 09:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:54.107 09:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:54.107 09:29:16 -- accel/accel.sh@21 -- # val= 00:05:54.107 09:29:16 -- accel/accel.sh@22 -- # case "$var" in 00:05:54.107 09:29:16 -- accel/accel.sh@20 -- # IFS=: 00:05:54.107 09:29:16 -- accel/accel.sh@20 -- # read -r var val 00:05:54.107 09:29:16 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:54.107 09:29:16 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:05:54.107 09:29:16 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:54.107 00:05:54.107 real 0m2.661s 00:05:54.107 user 0m2.400s 00:05:54.107 sys 0m0.268s 00:05:54.107 09:29:16 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:54.107 09:29:16 -- common/autotest_common.sh@10 -- # set +x 00:05:54.107 ************************************ 00:05:54.107 END TEST accel_dualcast 00:05:54.107 ************************************ 00:05:54.366 09:29:16 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:05:54.366 09:29:16 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:05:54.366 09:29:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:54.366 09:29:16 -- common/autotest_common.sh@10 -- # set +x 00:05:54.366 ************************************ 00:05:54.366 START TEST accel_compare 00:05:54.366 ************************************ 00:05:54.366 09:29:16 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compare -y 00:05:54.366 09:29:16 -- accel/accel.sh@16 -- # local accel_opc 00:05:54.366 09:29:16 
-- accel/accel.sh@17 -- # local accel_module 00:05:54.366 09:29:16 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:05:54.366 09:29:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:05:54.366 09:29:16 -- accel/accel.sh@12 -- # build_accel_config 00:05:54.366 09:29:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:54.366 09:29:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:54.366 09:29:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:54.366 09:29:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:54.366 09:29:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:54.366 09:29:16 -- accel/accel.sh@41 -- # local IFS=, 00:05:54.366 09:29:16 -- accel/accel.sh@42 -- # jq -r . 00:05:54.366 [2024-11-29 09:29:16.982699] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:54.366 [2024-11-29 09:29:16.982761] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3170369 ] 00:05:54.366 EAL: No free 2048 kB hugepages reported on node 1 00:05:54.366 [2024-11-29 09:29:17.041756] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:54.366 [2024-11-29 09:29:17.111585] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.745 09:29:18 -- accel/accel.sh@18 -- # out=' 00:05:55.745 SPDK Configuration: 00:05:55.745 Core mask: 0x1 00:05:55.745 00:05:55.745 Accel Perf Configuration: 00:05:55.745 Workload Type: compare 00:05:55.745 Transfer size: 4096 bytes 00:05:55.746 Vector count 1 00:05:55.746 Module: software 00:05:55.746 Queue depth: 32 00:05:55.746 Allocate depth: 32 00:05:55.746 # threads/core: 1 00:05:55.746 Run time: 1 seconds 00:05:55.746 Verify: Yes 00:05:55.746 00:05:55.746 Running for 1 seconds... 00:05:55.746 00:05:55.746 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:55.746 ------------------------------------------------------------------------------------ 00:05:55.746 0,0 801280/s 3130 MiB/s 0 0 00:05:55.746 ==================================================================================== 00:05:55.746 Total 801280/s 3130 MiB/s 0 0' 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:55.746 09:29:18 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:05:55.746 09:29:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:05:55.746 09:29:18 -- accel/accel.sh@12 -- # build_accel_config 00:05:55.746 09:29:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:55.746 09:29:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:55.746 09:29:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:55.746 09:29:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:55.746 09:29:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:55.746 09:29:18 -- accel/accel.sh@41 -- # local IFS=, 00:05:55.746 09:29:18 -- accel/accel.sh@42 -- # jq -r . 00:05:55.746 [2024-11-29 09:29:18.297979] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
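The compare workload above posts the highest transfer rate in this log (801280/s against 433408-685408/s for the copying workloads), which fits its read-only nature: two buffers are compared and nothing is written back. The bandwidth column again follows transfers times transfer size:
echo $(( 801280 * 4096 / 1024 / 1024 ))    # prints 3130, matching '3130 MiB/s'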
00:05:55.746 [2024-11-29 09:29:18.298071] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3170594 ] 00:05:55.746 EAL: No free 2048 kB hugepages reported on node 1 00:05:55.746 [2024-11-29 09:29:18.368940] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:55.746 [2024-11-29 09:29:18.438612] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.746 09:29:18 -- accel/accel.sh@21 -- # val= 00:05:55.746 09:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:55.746 09:29:18 -- accel/accel.sh@21 -- # val= 00:05:55.746 09:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:55.746 09:29:18 -- accel/accel.sh@21 -- # val=0x1 00:05:55.746 09:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:55.746 09:29:18 -- accel/accel.sh@21 -- # val= 00:05:55.746 09:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:55.746 09:29:18 -- accel/accel.sh@21 -- # val= 00:05:55.746 09:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:55.746 09:29:18 -- accel/accel.sh@21 -- # val=compare 00:05:55.746 09:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.746 09:29:18 -- accel/accel.sh@24 -- # accel_opc=compare 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:55.746 09:29:18 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:55.746 09:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:55.746 09:29:18 -- accel/accel.sh@21 -- # val= 00:05:55.746 09:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:55.746 09:29:18 -- accel/accel.sh@21 -- # val=software 00:05:55.746 09:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.746 09:29:18 -- accel/accel.sh@23 -- # accel_module=software 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:55.746 09:29:18 -- accel/accel.sh@21 -- # val=32 00:05:55.746 09:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:55.746 09:29:18 -- accel/accel.sh@21 -- # val=32 00:05:55.746 09:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:55.746 09:29:18 -- accel/accel.sh@21 -- # val=1 00:05:55.746 09:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:55.746 09:29:18 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:05:55.746 09:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:55.746 09:29:18 -- accel/accel.sh@21 -- # val=Yes 00:05:55.746 09:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:55.746 09:29:18 -- accel/accel.sh@21 -- # val= 00:05:55.746 09:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:55.746 09:29:18 -- accel/accel.sh@21 -- # val= 00:05:55.746 09:29:18 -- accel/accel.sh@22 -- # case "$var" in 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # IFS=: 00:05:55.746 09:29:18 -- accel/accel.sh@20 -- # read -r var val 00:05:57.126 09:29:19 -- accel/accel.sh@21 -- # val= 00:05:57.126 09:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.126 09:29:19 -- accel/accel.sh@20 -- # IFS=: 00:05:57.126 09:29:19 -- accel/accel.sh@20 -- # read -r var val 00:05:57.126 09:29:19 -- accel/accel.sh@21 -- # val= 00:05:57.126 09:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.126 09:29:19 -- accel/accel.sh@20 -- # IFS=: 00:05:57.126 09:29:19 -- accel/accel.sh@20 -- # read -r var val 00:05:57.126 09:29:19 -- accel/accel.sh@21 -- # val= 00:05:57.126 09:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.126 09:29:19 -- accel/accel.sh@20 -- # IFS=: 00:05:57.126 09:29:19 -- accel/accel.sh@20 -- # read -r var val 00:05:57.126 09:29:19 -- accel/accel.sh@21 -- # val= 00:05:57.126 09:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.126 09:29:19 -- accel/accel.sh@20 -- # IFS=: 00:05:57.126 09:29:19 -- accel/accel.sh@20 -- # read -r var val 00:05:57.126 09:29:19 -- accel/accel.sh@21 -- # val= 00:05:57.126 09:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.126 09:29:19 -- accel/accel.sh@20 -- # IFS=: 00:05:57.126 09:29:19 -- accel/accel.sh@20 -- # read -r var val 00:05:57.126 09:29:19 -- accel/accel.sh@21 -- # val= 00:05:57.126 09:29:19 -- accel/accel.sh@22 -- # case "$var" in 00:05:57.126 09:29:19 -- accel/accel.sh@20 -- # IFS=: 00:05:57.126 09:29:19 -- accel/accel.sh@20 -- # read -r var val 00:05:57.126 09:29:19 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:57.126 09:29:19 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:05:57.126 09:29:19 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:57.126 00:05:57.126 real 0m2.640s 00:05:57.126 user 0m2.408s 00:05:57.126 sys 0m0.242s 00:05:57.126 09:29:19 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:57.126 09:29:19 -- common/autotest_common.sh@10 -- # set +x 00:05:57.126 ************************************ 00:05:57.126 END TEST accel_compare 00:05:57.126 ************************************ 00:05:57.126 09:29:19 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:05:57.126 09:29:19 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:05:57.126 09:29:19 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:57.126 09:29:19 -- common/autotest_common.sh@10 -- # set +x 00:05:57.126 ************************************ 00:05:57.126 START TEST accel_xor 00:05:57.126 ************************************ 00:05:57.126 09:29:19 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y 00:05:57.126 09:29:19 -- accel/accel.sh@16 -- # local accel_opc 00:05:57.126 09:29:19 -- accel/accel.sh@17 
-- # local accel_module 00:05:57.126 09:29:19 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:05:57.126 09:29:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:05:57.126 09:29:19 -- accel/accel.sh@12 -- # build_accel_config 00:05:57.126 09:29:19 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:57.126 09:29:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:57.126 09:29:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:57.126 09:29:19 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:57.126 09:29:19 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:57.126 09:29:19 -- accel/accel.sh@41 -- # local IFS=, 00:05:57.126 09:29:19 -- accel/accel.sh@42 -- # jq -r . 00:05:57.126 [2024-11-29 09:29:19.665377] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:57.126 [2024-11-29 09:29:19.665442] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3170816 ] 00:05:57.126 EAL: No free 2048 kB hugepages reported on node 1 00:05:57.126 [2024-11-29 09:29:19.729615] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:57.126 [2024-11-29 09:29:19.799396] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:58.506 09:29:20 -- accel/accel.sh@18 -- # out=' 00:05:58.506 SPDK Configuration: 00:05:58.506 Core mask: 0x1 00:05:58.506 00:05:58.506 Accel Perf Configuration: 00:05:58.506 Workload Type: xor 00:05:58.506 Source buffers: 2 00:05:58.506 Transfer size: 4096 bytes 00:05:58.506 Vector count 1 00:05:58.506 Module: software 00:05:58.506 Queue depth: 32 00:05:58.506 Allocate depth: 32 00:05:58.506 # threads/core: 1 00:05:58.506 Run time: 1 seconds 00:05:58.506 Verify: Yes 00:05:58.506 00:05:58.506 Running for 1 seconds... 00:05:58.506 00:05:58.506 Core,Thread Transfers Bandwidth Failed Miscompares 00:05:58.506 ------------------------------------------------------------------------------------ 00:05:58.506 0,0 685408/s 2677 MiB/s 0 0 00:05:58.506 ==================================================================================== 00:05:58.506 Total 685408/s 2677 MiB/s 0 0' 00:05:58.506 09:29:20 -- accel/accel.sh@20 -- # IFS=: 00:05:58.506 09:29:20 -- accel/accel.sh@20 -- # read -r var val 00:05:58.506 09:29:20 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:05:58.506 09:29:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:05:58.506 09:29:20 -- accel/accel.sh@12 -- # build_accel_config 00:05:58.506 09:29:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:58.506 09:29:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:58.506 09:29:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:58.506 09:29:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:58.506 09:29:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:58.506 09:29:20 -- accel/accel.sh@41 -- # local IFS=, 00:05:58.506 09:29:20 -- accel/accel.sh@42 -- # jq -r . 00:05:58.506 [2024-11-29 09:29:20.988259] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
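For the two-source xor pass above, the reported bandwidth once more tracks payload transfers at the 4096-byte transfer size:
# 685408 transfers/s * 4096 B, in MiB/s
echo $(( 685408 * 4096 / 1024 / 1024 ))    # prints 2677, matching '2677 MiB/s'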
00:05:58.506 [2024-11-29 09:29:20.988350] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3170969 ] 00:05:58.506 EAL: No free 2048 kB hugepages reported on node 1 00:05:58.506 [2024-11-29 09:29:21.057634] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:58.506 [2024-11-29 09:29:21.126604] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:58.506 09:29:21 -- accel/accel.sh@21 -- # val= 00:05:58.506 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:58.506 09:29:21 -- accel/accel.sh@21 -- # val= 00:05:58.506 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:58.506 09:29:21 -- accel/accel.sh@21 -- # val=0x1 00:05:58.506 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:58.506 09:29:21 -- accel/accel.sh@21 -- # val= 00:05:58.506 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:58.506 09:29:21 -- accel/accel.sh@21 -- # val= 00:05:58.506 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:58.506 09:29:21 -- accel/accel.sh@21 -- # val=xor 00:05:58.506 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.506 09:29:21 -- accel/accel.sh@24 -- # accel_opc=xor 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:58.506 09:29:21 -- accel/accel.sh@21 -- # val=2 00:05:58.506 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:58.506 09:29:21 -- accel/accel.sh@21 -- # val='4096 bytes' 00:05:58.506 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:58.506 09:29:21 -- accel/accel.sh@21 -- # val= 00:05:58.506 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:58.506 09:29:21 -- accel/accel.sh@21 -- # val=software 00:05:58.506 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.506 09:29:21 -- accel/accel.sh@23 -- # accel_module=software 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:58.506 09:29:21 -- accel/accel.sh@21 -- # val=32 00:05:58.506 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:58.506 09:29:21 -- accel/accel.sh@21 -- # val=32 00:05:58.506 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:58.506 09:29:21 -- 
accel/accel.sh@21 -- # val=1 00:05:58.506 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:58.506 09:29:21 -- accel/accel.sh@21 -- # val='1 seconds' 00:05:58.506 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.506 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:58.507 09:29:21 -- accel/accel.sh@21 -- # val=Yes 00:05:58.507 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.507 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.507 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:58.507 09:29:21 -- accel/accel.sh@21 -- # val= 00:05:58.507 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.507 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.507 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:58.507 09:29:21 -- accel/accel.sh@21 -- # val= 00:05:58.507 09:29:21 -- accel/accel.sh@22 -- # case "$var" in 00:05:58.507 09:29:21 -- accel/accel.sh@20 -- # IFS=: 00:05:58.507 09:29:21 -- accel/accel.sh@20 -- # read -r var val 00:05:59.887 09:29:22 -- accel/accel.sh@21 -- # val= 00:05:59.887 09:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.887 09:29:22 -- accel/accel.sh@20 -- # IFS=: 00:05:59.887 09:29:22 -- accel/accel.sh@20 -- # read -r var val 00:05:59.887 09:29:22 -- accel/accel.sh@21 -- # val= 00:05:59.887 09:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.887 09:29:22 -- accel/accel.sh@20 -- # IFS=: 00:05:59.887 09:29:22 -- accel/accel.sh@20 -- # read -r var val 00:05:59.887 09:29:22 -- accel/accel.sh@21 -- # val= 00:05:59.887 09:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.887 09:29:22 -- accel/accel.sh@20 -- # IFS=: 00:05:59.887 09:29:22 -- accel/accel.sh@20 -- # read -r var val 00:05:59.887 09:29:22 -- accel/accel.sh@21 -- # val= 00:05:59.887 09:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.887 09:29:22 -- accel/accel.sh@20 -- # IFS=: 00:05:59.887 09:29:22 -- accel/accel.sh@20 -- # read -r var val 00:05:59.887 09:29:22 -- accel/accel.sh@21 -- # val= 00:05:59.887 09:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.887 09:29:22 -- accel/accel.sh@20 -- # IFS=: 00:05:59.887 09:29:22 -- accel/accel.sh@20 -- # read -r var val 00:05:59.887 09:29:22 -- accel/accel.sh@21 -- # val= 00:05:59.887 09:29:22 -- accel/accel.sh@22 -- # case "$var" in 00:05:59.887 09:29:22 -- accel/accel.sh@20 -- # IFS=: 00:05:59.887 09:29:22 -- accel/accel.sh@20 -- # read -r var val 00:05:59.887 09:29:22 -- accel/accel.sh@28 -- # [[ -n software ]] 00:05:59.887 09:29:22 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:05:59.887 09:29:22 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:05:59.887 00:05:59.887 real 0m2.647s 00:05:59.887 user 0m2.403s 00:05:59.887 sys 0m0.253s 00:05:59.887 09:29:22 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:59.887 09:29:22 -- common/autotest_common.sh@10 -- # set +x 00:05:59.887 ************************************ 00:05:59.887 END TEST accel_xor 00:05:59.887 ************************************ 00:05:59.887 09:29:22 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:05:59.887 09:29:22 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:05:59.887 09:29:22 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:59.887 09:29:22 -- common/autotest_common.sh@10 -- # set +x 00:05:59.888 ************************************ 00:05:59.888 START TEST accel_xor 
00:05:59.888 ************************************ 00:05:59.888 09:29:22 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y -x 3 00:05:59.888 09:29:22 -- accel/accel.sh@16 -- # local accel_opc 00:05:59.888 09:29:22 -- accel/accel.sh@17 -- # local accel_module 00:05:59.888 09:29:22 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:05:59.888 09:29:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:05:59.888 09:29:22 -- accel/accel.sh@12 -- # build_accel_config 00:05:59.888 09:29:22 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:05:59.888 09:29:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:05:59.888 09:29:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:05:59.888 09:29:22 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:05:59.888 09:29:22 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:05:59.888 09:29:22 -- accel/accel.sh@41 -- # local IFS=, 00:05:59.888 09:29:22 -- accel/accel.sh@42 -- # jq -r . 00:05:59.888 [2024-11-29 09:29:22.367514] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:59.888 [2024-11-29 09:29:22.367591] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3171233 ] 00:05:59.888 EAL: No free 2048 kB hugepages reported on node 1 00:05:59.888 [2024-11-29 09:29:22.436112] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:59.888 [2024-11-29 09:29:22.505802] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.267 09:29:23 -- accel/accel.sh@18 -- # out=' 00:06:01.267 SPDK Configuration: 00:06:01.267 Core mask: 0x1 00:06:01.267 00:06:01.267 Accel Perf Configuration: 00:06:01.267 Workload Type: xor 00:06:01.267 Source buffers: 3 00:06:01.267 Transfer size: 4096 bytes 00:06:01.267 Vector count 1 00:06:01.267 Module: software 00:06:01.267 Queue depth: 32 00:06:01.267 Allocate depth: 32 00:06:01.267 # threads/core: 1 00:06:01.267 Run time: 1 seconds 00:06:01.267 Verify: Yes 00:06:01.267 00:06:01.267 Running for 1 seconds... 00:06:01.267 00:06:01.267 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:01.267 ------------------------------------------------------------------------------------ 00:06:01.267 0,0 665664/s 2600 MiB/s 0 0 00:06:01.267 ==================================================================================== 00:06:01.267 Total 665664/s 2600 MiB/s 0 0' 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:01.267 09:29:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:01.267 09:29:23 -- accel/accel.sh@12 -- # build_accel_config 00:06:01.267 09:29:23 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:01.267 09:29:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:01.267 09:29:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:01.267 09:29:23 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:01.267 09:29:23 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:01.267 09:29:23 -- accel/accel.sh@41 -- # local IFS=, 00:06:01.267 09:29:23 -- accel/accel.sh@42 -- # jq -r . 00:06:01.267 [2024-11-29 09:29:23.695005] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:01.267 [2024-11-29 09:29:23.695096] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3171499 ] 00:06:01.267 EAL: No free 2048 kB hugepages reported on node 1 00:06:01.267 [2024-11-29 09:29:23.765894] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:01.267 [2024-11-29 09:29:23.834256] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.267 09:29:23 -- accel/accel.sh@21 -- # val= 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- accel/accel.sh@21 -- # val= 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- accel/accel.sh@21 -- # val=0x1 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- accel/accel.sh@21 -- # val= 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- accel/accel.sh@21 -- # val= 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- accel/accel.sh@21 -- # val=xor 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- accel/accel.sh@21 -- # val=3 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- accel/accel.sh@21 -- # val= 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- accel/accel.sh@21 -- # val=software 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@23 -- # accel_module=software 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- accel/accel.sh@21 -- # val=32 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- accel/accel.sh@21 -- # val=32 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- 
accel/accel.sh@21 -- # val=1 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- accel/accel.sh@21 -- # val=Yes 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- accel/accel.sh@21 -- # val= 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:01.267 09:29:23 -- accel/accel.sh@21 -- # val= 00:06:01.267 09:29:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # IFS=: 00:06:01.267 09:29:23 -- accel/accel.sh@20 -- # read -r var val 00:06:02.205 09:29:24 -- accel/accel.sh@21 -- # val= 00:06:02.205 09:29:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.205 09:29:24 -- accel/accel.sh@20 -- # IFS=: 00:06:02.205 09:29:24 -- accel/accel.sh@20 -- # read -r var val 00:06:02.205 09:29:25 -- accel/accel.sh@21 -- # val= 00:06:02.205 09:29:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.205 09:29:25 -- accel/accel.sh@20 -- # IFS=: 00:06:02.205 09:29:25 -- accel/accel.sh@20 -- # read -r var val 00:06:02.205 09:29:25 -- accel/accel.sh@21 -- # val= 00:06:02.205 09:29:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.205 09:29:25 -- accel/accel.sh@20 -- # IFS=: 00:06:02.205 09:29:25 -- accel/accel.sh@20 -- # read -r var val 00:06:02.205 09:29:25 -- accel/accel.sh@21 -- # val= 00:06:02.205 09:29:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.205 09:29:25 -- accel/accel.sh@20 -- # IFS=: 00:06:02.205 09:29:25 -- accel/accel.sh@20 -- # read -r var val 00:06:02.205 09:29:25 -- accel/accel.sh@21 -- # val= 00:06:02.205 09:29:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.205 09:29:25 -- accel/accel.sh@20 -- # IFS=: 00:06:02.206 09:29:25 -- accel/accel.sh@20 -- # read -r var val 00:06:02.206 09:29:25 -- accel/accel.sh@21 -- # val= 00:06:02.206 09:29:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:02.206 09:29:25 -- accel/accel.sh@20 -- # IFS=: 00:06:02.206 09:29:25 -- accel/accel.sh@20 -- # read -r var val 00:06:02.206 09:29:25 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:02.206 09:29:25 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:02.206 09:29:25 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:02.206 00:06:02.206 real 0m2.661s 00:06:02.206 user 0m2.427s 00:06:02.206 sys 0m0.241s 00:06:02.206 09:29:25 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:02.206 09:29:25 -- common/autotest_common.sh@10 -- # set +x 00:06:02.206 ************************************ 00:06:02.206 END TEST accel_xor 00:06:02.206 ************************************ 00:06:02.465 09:29:25 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:02.465 09:29:25 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:02.465 09:29:25 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:02.465 09:29:25 -- common/autotest_common.sh@10 -- # set +x 00:06:02.465 ************************************ 00:06:02.465 START TEST 
accel_dif_verify 00:06:02.465 ************************************ 00:06:02.465 09:29:25 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_verify 00:06:02.465 09:29:25 -- accel/accel.sh@16 -- # local accel_opc 00:06:02.465 09:29:25 -- accel/accel.sh@17 -- # local accel_module 00:06:02.465 09:29:25 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:06:02.465 09:29:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:02.465 09:29:25 -- accel/accel.sh@12 -- # build_accel_config 00:06:02.465 09:29:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:02.465 09:29:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:02.465 09:29:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:02.465 09:29:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:02.465 09:29:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:02.465 09:29:25 -- accel/accel.sh@41 -- # local IFS=, 00:06:02.465 09:29:25 -- accel/accel.sh@42 -- # jq -r . 00:06:02.465 [2024-11-29 09:29:25.078040] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:02.465 [2024-11-29 09:29:25.078134] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3171788 ] 00:06:02.465 EAL: No free 2048 kB hugepages reported on node 1 00:06:02.465 [2024-11-29 09:29:25.146777] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:02.465 [2024-11-29 09:29:25.215772] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.844 09:29:26 -- accel/accel.sh@18 -- # out=' 00:06:03.844 SPDK Configuration: 00:06:03.844 Core mask: 0x1 00:06:03.844 00:06:03.844 Accel Perf Configuration: 00:06:03.844 Workload Type: dif_verify 00:06:03.844 Vector size: 4096 bytes 00:06:03.844 Transfer size: 4096 bytes 00:06:03.844 Block size: 512 bytes 00:06:03.844 Metadata size: 8 bytes 00:06:03.844 Vector count 1 00:06:03.844 Module: software 00:06:03.844 Queue depth: 32 00:06:03.844 Allocate depth: 32 00:06:03.844 # threads/core: 1 00:06:03.844 Run time: 1 seconds 00:06:03.844 Verify: No 00:06:03.844 00:06:03.844 Running for 1 seconds... 00:06:03.844 00:06:03.844 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:03.844 ------------------------------------------------------------------------------------ 00:06:03.844 0,0 246368/s 977 MiB/s 0 0 00:06:03.844 ==================================================================================== 00:06:03.844 Total 246368/s 962 MiB/s 0 0' 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:03.844 09:29:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:03.844 09:29:26 -- accel/accel.sh@12 -- # build_accel_config 00:06:03.844 09:29:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:03.844 09:29:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:03.844 09:29:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:03.844 09:29:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:03.844 09:29:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:03.844 09:29:26 -- accel/accel.sh@41 -- # local IFS=, 00:06:03.844 09:29:26 -- accel/accel.sh@42 -- # jq -r . 
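For anyone reproducing the dif_verify numbers above by hand, this is a minimal sketch of the invocation the harness drives, using only flags visible in this log; the relative path assumes you are inside an SPDK checkout, and the JSON config the harness pipes in over /dev/fd/62 is omitted here, so module defaults apply.
    # run the software dif_verify workload for 1 second (sketch; defaults stand in for the fd-passed config)
    ./build/examples/accel_perf -t 1 -w dif_verify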
00:06:03.844 [2024-11-29 09:29:26.404128] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:03.844 [2024-11-29 09:29:26.404216] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3172054 ] 00:06:03.844 EAL: No free 2048 kB hugepages reported on node 1 00:06:03.844 [2024-11-29 09:29:26.474740] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:03.844 [2024-11-29 09:29:26.542713] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val= 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val= 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val=0x1 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val= 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val= 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val=dif_verify 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val= 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val=software 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@23 -- # 
accel_module=software 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val=32 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val=32 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val=1 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val=No 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val= 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:03.844 09:29:26 -- accel/accel.sh@21 -- # val= 00:06:03.844 09:29:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # IFS=: 00:06:03.844 09:29:26 -- accel/accel.sh@20 -- # read -r var val 00:06:05.224 09:29:27 -- accel/accel.sh@21 -- # val= 00:06:05.224 09:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.224 09:29:27 -- accel/accel.sh@20 -- # IFS=: 00:06:05.224 09:29:27 -- accel/accel.sh@20 -- # read -r var val 00:06:05.224 09:29:27 -- accel/accel.sh@21 -- # val= 00:06:05.224 09:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.224 09:29:27 -- accel/accel.sh@20 -- # IFS=: 00:06:05.224 09:29:27 -- accel/accel.sh@20 -- # read -r var val 00:06:05.224 09:29:27 -- accel/accel.sh@21 -- # val= 00:06:05.224 09:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.224 09:29:27 -- accel/accel.sh@20 -- # IFS=: 00:06:05.224 09:29:27 -- accel/accel.sh@20 -- # read -r var val 00:06:05.224 09:29:27 -- accel/accel.sh@21 -- # val= 00:06:05.224 09:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.224 09:29:27 -- accel/accel.sh@20 -- # IFS=: 00:06:05.224 09:29:27 -- accel/accel.sh@20 -- # read -r var val 00:06:05.224 09:29:27 -- accel/accel.sh@21 -- # val= 00:06:05.224 09:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.224 09:29:27 -- accel/accel.sh@20 -- # IFS=: 00:06:05.224 09:29:27 -- accel/accel.sh@20 -- # read -r var val 00:06:05.224 09:29:27 -- accel/accel.sh@21 -- # val= 00:06:05.224 09:29:27 -- accel/accel.sh@22 -- # case "$var" in 00:06:05.224 09:29:27 -- accel/accel.sh@20 -- # IFS=: 00:06:05.224 09:29:27 -- accel/accel.sh@20 -- # read -r var val 00:06:05.224 09:29:27 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:05.224 09:29:27 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:06:05.224 09:29:27 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:05.224 00:06:05.224 real 0m2.660s 00:06:05.224 user 0m2.401s 00:06:05.224 sys 0m0.267s 00:06:05.224 09:29:27 -- 
common/autotest_common.sh@1115 -- # xtrace_disable 00:06:05.224 09:29:27 -- common/autotest_common.sh@10 -- # set +x 00:06:05.224 ************************************ 00:06:05.224 END TEST accel_dif_verify 00:06:05.224 ************************************ 00:06:05.224 09:29:27 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:06:05.224 09:29:27 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:05.224 09:29:27 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:05.224 09:29:27 -- common/autotest_common.sh@10 -- # set +x 00:06:05.224 ************************************ 00:06:05.224 START TEST accel_dif_generate 00:06:05.224 ************************************ 00:06:05.224 09:29:27 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate 00:06:05.224 09:29:27 -- accel/accel.sh@16 -- # local accel_opc 00:06:05.224 09:29:27 -- accel/accel.sh@17 -- # local accel_module 00:06:05.224 09:29:27 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:06:05.224 09:29:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:05.224 09:29:27 -- accel/accel.sh@12 -- # build_accel_config 00:06:05.224 09:29:27 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:05.224 09:29:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:05.224 09:29:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:05.224 09:29:27 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:05.224 09:29:27 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:05.224 09:29:27 -- accel/accel.sh@41 -- # local IFS=, 00:06:05.224 09:29:27 -- accel/accel.sh@42 -- # jq -r . 00:06:05.224 [2024-11-29 09:29:27.788364] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:05.224 [2024-11-29 09:29:27.788449] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3172342 ] 00:06:05.224 EAL: No free 2048 kB hugepages reported on node 1 00:06:05.224 [2024-11-29 09:29:27.857199] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.224 [2024-11-29 09:29:27.926608] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.604 09:29:29 -- accel/accel.sh@18 -- # out=' 00:06:06.604 SPDK Configuration: 00:06:06.604 Core mask: 0x1 00:06:06.604 00:06:06.604 Accel Perf Configuration: 00:06:06.604 Workload Type: dif_generate 00:06:06.604 Vector size: 4096 bytes 00:06:06.604 Transfer size: 4096 bytes 00:06:06.604 Block size: 512 bytes 00:06:06.604 Metadata size: 8 bytes 00:06:06.604 Vector count 1 00:06:06.604 Module: software 00:06:06.604 Queue depth: 32 00:06:06.604 Allocate depth: 32 00:06:06.604 # threads/core: 1 00:06:06.604 Run time: 1 seconds 00:06:06.604 Verify: No 00:06:06.604 00:06:06.604 Running for 1 seconds... 
00:06:06.604 00:06:06.604 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:06.604 ------------------------------------------------------------------------------------ 00:06:06.604 0,0 287456/s 1140 MiB/s 0 0 00:06:06.604 ==================================================================================== 00:06:06.604 Total 287456/s 1122 MiB/s 0 0' 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.604 09:29:29 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:06:06.604 09:29:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:06.604 09:29:29 -- accel/accel.sh@12 -- # build_accel_config 00:06:06.604 09:29:29 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:06.604 09:29:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:06.604 09:29:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:06.604 09:29:29 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:06.604 09:29:29 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:06.604 09:29:29 -- accel/accel.sh@41 -- # local IFS=, 00:06:06.604 09:29:29 -- accel/accel.sh@42 -- # jq -r . 00:06:06.604 [2024-11-29 09:29:29.115658] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:06.604 [2024-11-29 09:29:29.115750] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3172614 ] 00:06:06.604 EAL: No free 2048 kB hugepages reported on node 1 00:06:06.604 [2024-11-29 09:29:29.185350] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:06.604 [2024-11-29 09:29:29.253665] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.604 09:29:29 -- accel/accel.sh@21 -- # val= 00:06:06.604 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.604 09:29:29 -- accel/accel.sh@21 -- # val= 00:06:06.604 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.604 09:29:29 -- accel/accel.sh@21 -- # val=0x1 00:06:06.604 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.604 09:29:29 -- accel/accel.sh@21 -- # val= 00:06:06.604 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.604 09:29:29 -- accel/accel.sh@21 -- # val= 00:06:06.604 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.604 09:29:29 -- accel/accel.sh@21 -- # val=dif_generate 00:06:06.604 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.604 09:29:29 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.604 09:29:29 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:06.604 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # IFS=: 
00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.604 09:29:29 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:06.604 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.604 09:29:29 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:06.604 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.604 09:29:29 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:06.604 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.604 09:29:29 -- accel/accel.sh@21 -- # val= 00:06:06.604 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.604 09:29:29 -- accel/accel.sh@21 -- # val=software 00:06:06.604 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.604 09:29:29 -- accel/accel.sh@23 -- # accel_module=software 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.604 09:29:29 -- accel/accel.sh@21 -- # val=32 00:06:06.604 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.604 09:29:29 -- accel/accel.sh@21 -- # val=32 00:06:06.604 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.604 09:29:29 -- accel/accel.sh@21 -- # val=1 00:06:06.604 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.604 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.604 09:29:29 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:06.604 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.605 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.605 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.605 09:29:29 -- accel/accel.sh@21 -- # val=No 00:06:06.605 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.605 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.605 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.605 09:29:29 -- accel/accel.sh@21 -- # val= 00:06:06.605 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.605 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.605 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:06.605 09:29:29 -- accel/accel.sh@21 -- # val= 00:06:06.605 09:29:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:06.605 09:29:29 -- accel/accel.sh@20 -- # IFS=: 00:06:06.605 09:29:29 -- accel/accel.sh@20 -- # read -r var val 00:06:07.984 09:29:30 -- accel/accel.sh@21 -- # val= 00:06:07.984 09:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.984 09:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:07.984 09:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:07.984 09:29:30 -- accel/accel.sh@21 -- # val= 00:06:07.984 09:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.984 09:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:07.984 09:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:07.984 09:29:30 -- accel/accel.sh@21 -- # val= 00:06:07.984 09:29:30 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:07.984 09:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:07.984 09:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:07.984 09:29:30 -- accel/accel.sh@21 -- # val= 00:06:07.984 09:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.984 09:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:07.984 09:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:07.984 09:29:30 -- accel/accel.sh@21 -- # val= 00:06:07.984 09:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.984 09:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:07.984 09:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:07.984 09:29:30 -- accel/accel.sh@21 -- # val= 00:06:07.984 09:29:30 -- accel/accel.sh@22 -- # case "$var" in 00:06:07.984 09:29:30 -- accel/accel.sh@20 -- # IFS=: 00:06:07.984 09:29:30 -- accel/accel.sh@20 -- # read -r var val 00:06:07.984 09:29:30 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:07.984 09:29:30 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:06:07.984 09:29:30 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:07.984 00:06:07.984 real 0m2.663s 00:06:07.984 user 0m2.420s 00:06:07.984 sys 0m0.252s 00:06:07.984 09:29:30 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:07.984 09:29:30 -- common/autotest_common.sh@10 -- # set +x 00:06:07.984 ************************************ 00:06:07.985 END TEST accel_dif_generate 00:06:07.985 ************************************ 00:06:07.985 09:29:30 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:06:07.985 09:29:30 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:07.985 09:29:30 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:07.985 09:29:30 -- common/autotest_common.sh@10 -- # set +x 00:06:07.985 ************************************ 00:06:07.985 START TEST accel_dif_generate_copy 00:06:07.985 ************************************ 00:06:07.985 09:29:30 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate_copy 00:06:07.985 09:29:30 -- accel/accel.sh@16 -- # local accel_opc 00:06:07.985 09:29:30 -- accel/accel.sh@17 -- # local accel_module 00:06:07.985 09:29:30 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:06:07.985 09:29:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:07.985 09:29:30 -- accel/accel.sh@12 -- # build_accel_config 00:06:07.985 09:29:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:07.985 09:29:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:07.985 09:29:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:07.985 09:29:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:07.985 09:29:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:07.985 09:29:30 -- accel/accel.sh@41 -- # local IFS=, 00:06:07.985 09:29:30 -- accel/accel.sh@42 -- # jq -r . 00:06:07.985 [2024-11-29 09:29:30.493117] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
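The dif_generate test that just finished and the dif_generate_copy test starting here take the same shape; a minimal sketch under the same assumptions as above (the 512-byte block and 8-byte metadata sizes come from the harness configuration shown in the traces, not from extra command-line flags):
    # generate DIF metadata with the software module for 1 second (sketch)
    ./build/examples/accel_perf -t 1 -w dif_generate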
00:06:07.985 [2024-11-29 09:29:30.493207] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3172841 ] 00:06:07.985 EAL: No free 2048 kB hugepages reported on node 1 00:06:07.985 [2024-11-29 09:29:30.561669] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:07.985 [2024-11-29 09:29:30.631157] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.362 09:29:31 -- accel/accel.sh@18 -- # out=' 00:06:09.362 SPDK Configuration: 00:06:09.362 Core mask: 0x1 00:06:09.362 00:06:09.362 Accel Perf Configuration: 00:06:09.362 Workload Type: dif_generate_copy 00:06:09.362 Vector size: 4096 bytes 00:06:09.362 Transfer size: 4096 bytes 00:06:09.362 Vector count 1 00:06:09.362 Module: software 00:06:09.362 Queue depth: 32 00:06:09.362 Allocate depth: 32 00:06:09.362 # threads/core: 1 00:06:09.362 Run time: 1 seconds 00:06:09.362 Verify: No 00:06:09.362 00:06:09.362 Running for 1 seconds... 00:06:09.362 00:06:09.362 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:09.362 ------------------------------------------------------------------------------------ 00:06:09.362 0,0 225984/s 896 MiB/s 0 0 00:06:09.362 ==================================================================================== 00:06:09.362 Total 225984/s 882 MiB/s 0 0' 00:06:09.362 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.362 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:09.362 09:29:31 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:06:09.362 09:29:31 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:09.362 09:29:31 -- accel/accel.sh@12 -- # build_accel_config 00:06:09.362 09:29:31 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:09.362 09:29:31 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:09.362 09:29:31 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:09.362 09:29:31 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:09.362 09:29:31 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:09.362 09:29:31 -- accel/accel.sh@41 -- # local IFS=, 00:06:09.362 09:29:31 -- accel/accel.sh@42 -- # jq -r . 00:06:09.362 [2024-11-29 09:29:31.809926] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:09.362 [2024-11-29 09:29:31.809994] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3173003 ] 00:06:09.362 EAL: No free 2048 kB hugepages reported on node 1 00:06:09.362 [2024-11-29 09:29:31.875416] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:09.362 [2024-11-29 09:29:31.945274] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.362 09:29:31 -- accel/accel.sh@21 -- # val= 00:06:09.362 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.362 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.362 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:09.362 09:29:31 -- accel/accel.sh@21 -- # val= 00:06:09.362 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.362 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.362 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:09.362 09:29:31 -- accel/accel.sh@21 -- # val=0x1 00:06:09.362 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.362 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.362 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:09.362 09:29:31 -- accel/accel.sh@21 -- # val= 00:06:09.362 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.362 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.362 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:09.362 09:29:31 -- accel/accel.sh@21 -- # val= 00:06:09.362 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.362 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.362 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:09.362 09:29:31 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:06:09.362 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.362 09:29:31 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:06:09.362 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.362 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:09.362 09:29:31 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:09.362 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.362 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.362 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:09.362 09:29:31 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:09.362 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:09.363 09:29:31 -- accel/accel.sh@21 -- # val= 00:06:09.363 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:09.363 09:29:31 -- accel/accel.sh@21 -- # val=software 00:06:09.363 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.363 09:29:31 -- accel/accel.sh@23 -- # accel_module=software 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:09.363 09:29:31 -- accel/accel.sh@21 -- # val=32 00:06:09.363 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:09.363 09:29:31 -- accel/accel.sh@21 -- # val=32 00:06:09.363 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # read -r 
var val 00:06:09.363 09:29:31 -- accel/accel.sh@21 -- # val=1 00:06:09.363 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:09.363 09:29:31 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:09.363 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:09.363 09:29:31 -- accel/accel.sh@21 -- # val=No 00:06:09.363 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:09.363 09:29:31 -- accel/accel.sh@21 -- # val= 00:06:09.363 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:09.363 09:29:31 -- accel/accel.sh@21 -- # val= 00:06:09.363 09:29:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # IFS=: 00:06:09.363 09:29:31 -- accel/accel.sh@20 -- # read -r var val 00:06:10.300 09:29:33 -- accel/accel.sh@21 -- # val= 00:06:10.300 09:29:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.300 09:29:33 -- accel/accel.sh@20 -- # IFS=: 00:06:10.300 09:29:33 -- accel/accel.sh@20 -- # read -r var val 00:06:10.300 09:29:33 -- accel/accel.sh@21 -- # val= 00:06:10.300 09:29:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.300 09:29:33 -- accel/accel.sh@20 -- # IFS=: 00:06:10.300 09:29:33 -- accel/accel.sh@20 -- # read -r var val 00:06:10.300 09:29:33 -- accel/accel.sh@21 -- # val= 00:06:10.300 09:29:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.300 09:29:33 -- accel/accel.sh@20 -- # IFS=: 00:06:10.300 09:29:33 -- accel/accel.sh@20 -- # read -r var val 00:06:10.300 09:29:33 -- accel/accel.sh@21 -- # val= 00:06:10.300 09:29:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.300 09:29:33 -- accel/accel.sh@20 -- # IFS=: 00:06:10.300 09:29:33 -- accel/accel.sh@20 -- # read -r var val 00:06:10.300 09:29:33 -- accel/accel.sh@21 -- # val= 00:06:10.300 09:29:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.300 09:29:33 -- accel/accel.sh@20 -- # IFS=: 00:06:10.300 09:29:33 -- accel/accel.sh@20 -- # read -r var val 00:06:10.300 09:29:33 -- accel/accel.sh@21 -- # val= 00:06:10.300 09:29:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:10.300 09:29:33 -- accel/accel.sh@20 -- # IFS=: 00:06:10.300 09:29:33 -- accel/accel.sh@20 -- # read -r var val 00:06:10.300 09:29:33 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:10.300 09:29:33 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:06:10.300 09:29:33 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:10.300 00:06:10.300 real 0m2.641s 00:06:10.300 user 0m2.388s 00:06:10.300 sys 0m0.250s 00:06:10.300 09:29:33 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:10.300 09:29:33 -- common/autotest_common.sh@10 -- # set +x 00:06:10.300 ************************************ 00:06:10.300 END TEST accel_dif_generate_copy 00:06:10.300 ************************************ 00:06:10.559 09:29:33 -- accel/accel.sh@107 -- # [[ y == y ]] 00:06:10.559 09:29:33 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:10.559 09:29:33 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:10.559 09:29:33 -- 
common/autotest_common.sh@1093 -- # xtrace_disable 00:06:10.559 09:29:33 -- common/autotest_common.sh@10 -- # set +x 00:06:10.559 ************************************ 00:06:10.559 START TEST accel_comp 00:06:10.559 ************************************ 00:06:10.559 09:29:33 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:10.559 09:29:33 -- accel/accel.sh@16 -- # local accel_opc 00:06:10.559 09:29:33 -- accel/accel.sh@17 -- # local accel_module 00:06:10.559 09:29:33 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:10.559 09:29:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:10.559 09:29:33 -- accel/accel.sh@12 -- # build_accel_config 00:06:10.559 09:29:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:10.559 09:29:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:10.559 09:29:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:10.559 09:29:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:10.559 09:29:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:10.559 09:29:33 -- accel/accel.sh@41 -- # local IFS=, 00:06:10.559 09:29:33 -- accel/accel.sh@42 -- # jq -r . 00:06:10.559 [2024-11-29 09:29:33.177428] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:10.559 [2024-11-29 09:29:33.177518] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3173211 ] 00:06:10.559 EAL: No free 2048 kB hugepages reported on node 1 00:06:10.559 [2024-11-29 09:29:33.250557] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.559 [2024-11-29 09:29:33.320738] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.939 09:29:34 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:11.939 00:06:11.939 SPDK Configuration: 00:06:11.939 Core mask: 0x1 00:06:11.939 00:06:11.939 Accel Perf Configuration: 00:06:11.939 Workload Type: compress 00:06:11.939 Transfer size: 4096 bytes 00:06:11.939 Vector count 1 00:06:11.939 Module: software 00:06:11.939 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:11.939 Queue depth: 32 00:06:11.939 Allocate depth: 32 00:06:11.939 # threads/core: 1 00:06:11.939 Run time: 1 seconds 00:06:11.939 Verify: No 00:06:11.939 00:06:11.939 Running for 1 seconds... 
00:06:11.939 00:06:11.939 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:11.939 ------------------------------------------------------------------------------------ 00:06:11.939 0,0 64800/s 270 MiB/s 0 0 00:06:11.939 ==================================================================================== 00:06:11.939 Total 64800/s 253 MiB/s 0 0' 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:11.939 09:29:34 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:11.939 09:29:34 -- accel/accel.sh@12 -- # build_accel_config 00:06:11.939 09:29:34 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:11.939 09:29:34 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:11.939 09:29:34 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:11.939 09:29:34 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:11.939 09:29:34 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:11.939 09:29:34 -- accel/accel.sh@41 -- # local IFS=, 00:06:11.939 09:29:34 -- accel/accel.sh@42 -- # jq -r . 00:06:11.939 [2024-11-29 09:29:34.500663] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:11.939 [2024-11-29 09:29:34.500730] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3173473 ] 00:06:11.939 EAL: No free 2048 kB hugepages reported on node 1 00:06:11.939 [2024-11-29 09:29:34.566285] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.939 [2024-11-29 09:29:34.635273] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val= 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val= 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val= 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val=0x1 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val= 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val= 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val=compress 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 
09:29:34 -- accel/accel.sh@24 -- # accel_opc=compress 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val= 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val=software 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@23 -- # accel_module=software 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val=32 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val=32 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val=1 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val=No 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val= 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:11.939 09:29:34 -- accel/accel.sh@21 -- # val= 00:06:11.939 09:29:34 -- accel/accel.sh@22 -- # case "$var" in 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # IFS=: 00:06:11.939 09:29:34 -- accel/accel.sh@20 -- # read -r var val 00:06:13.317 09:29:35 -- accel/accel.sh@21 -- # val= 00:06:13.317 09:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.317 09:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:13.317 09:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:13.317 09:29:35 -- accel/accel.sh@21 -- # val= 00:06:13.317 09:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.317 09:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:13.317 09:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:13.317 09:29:35 -- accel/accel.sh@21 -- # val= 00:06:13.317 09:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.317 09:29:35 -- accel/accel.sh@20 -- # 
IFS=: 00:06:13.317 09:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:13.317 09:29:35 -- accel/accel.sh@21 -- # val= 00:06:13.317 09:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.317 09:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:13.317 09:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:13.317 09:29:35 -- accel/accel.sh@21 -- # val= 00:06:13.317 09:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.317 09:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:13.317 09:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:13.317 09:29:35 -- accel/accel.sh@21 -- # val= 00:06:13.317 09:29:35 -- accel/accel.sh@22 -- # case "$var" in 00:06:13.317 09:29:35 -- accel/accel.sh@20 -- # IFS=: 00:06:13.317 09:29:35 -- accel/accel.sh@20 -- # read -r var val 00:06:13.317 09:29:35 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:13.317 09:29:35 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:06:13.317 09:29:35 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:13.317 00:06:13.317 real 0m2.652s 00:06:13.317 user 0m2.402s 00:06:13.317 sys 0m0.246s 00:06:13.317 09:29:35 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:13.317 09:29:35 -- common/autotest_common.sh@10 -- # set +x 00:06:13.317 ************************************ 00:06:13.317 END TEST accel_comp 00:06:13.317 ************************************ 00:06:13.317 09:29:35 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:13.317 09:29:35 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:13.317 09:29:35 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:13.317 09:29:35 -- common/autotest_common.sh@10 -- # set +x 00:06:13.317 ************************************ 00:06:13.317 START TEST accel_decomp 00:06:13.317 ************************************ 00:06:13.317 09:29:35 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:13.317 09:29:35 -- accel/accel.sh@16 -- # local accel_opc 00:06:13.317 09:29:35 -- accel/accel.sh@17 -- # local accel_module 00:06:13.317 09:29:35 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:13.317 09:29:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:13.317 09:29:35 -- accel/accel.sh@12 -- # build_accel_config 00:06:13.317 09:29:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:13.317 09:29:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:13.317 09:29:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:13.317 09:29:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:13.317 09:29:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:13.317 09:29:35 -- accel/accel.sh@41 -- # local IFS=, 00:06:13.317 09:29:35 -- accel/accel.sh@42 -- # jq -r . 00:06:13.317 [2024-11-29 09:29:35.866474] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
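The compress run summarized above and the decompress run starting here differ from the earlier workloads in that accel_perf is pointed at an input file with -l and, for decompress, output verification is enabled with -y; a minimal sketch mirroring the flags in this log (the bib input file ships in the SPDK tree; the workspace prefix in the traces is environment-specific):
    # decompress the test input for 1 second and verify the output (sketch)
    ./build/examples/accel_perf -t 1 -w decompress -l ./test/accel/bib -y -o 0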
00:06:13.317 [2024-11-29 09:29:35.866555] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3173757 ] 00:06:13.317 EAL: No free 2048 kB hugepages reported on node 1 00:06:13.317 [2024-11-29 09:29:35.934509] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.317 [2024-11-29 09:29:36.003225] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.696 09:29:37 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:14.696 00:06:14.696 SPDK Configuration: 00:06:14.696 Core mask: 0x1 00:06:14.696 00:06:14.696 Accel Perf Configuration: 00:06:14.696 Workload Type: decompress 00:06:14.696 Transfer size: 4096 bytes 00:06:14.696 Vector count 1 00:06:14.696 Module: software 00:06:14.696 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:14.696 Queue depth: 32 00:06:14.696 Allocate depth: 32 00:06:14.696 # threads/core: 1 00:06:14.696 Run time: 1 seconds 00:06:14.696 Verify: Yes 00:06:14.696 00:06:14.696 Running for 1 seconds... 00:06:14.696 00:06:14.696 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:14.696 ------------------------------------------------------------------------------------ 00:06:14.696 0,0 90688/s 167 MiB/s 0 0 00:06:14.696 ==================================================================================== 00:06:14.696 Total 90688/s 354 MiB/s 0 0' 00:06:14.696 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.696 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.696 09:29:37 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:14.696 09:29:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:14.696 09:29:37 -- accel/accel.sh@12 -- # build_accel_config 00:06:14.696 09:29:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:14.696 09:29:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:14.696 09:29:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:14.696 09:29:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:14.696 09:29:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:14.697 09:29:37 -- accel/accel.sh@41 -- # local IFS=, 00:06:14.697 09:29:37 -- accel/accel.sh@42 -- # jq -r . 00:06:14.697 [2024-11-29 09:29:37.182460] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:14.697 [2024-11-29 09:29:37.182523] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3174032 ] 00:06:14.697 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.697 [2024-11-29 09:29:37.248190] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:14.697 [2024-11-29 09:29:37.315976] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val= 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val= 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val= 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val=0x1 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val= 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val= 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val=decompress 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val= 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val=software 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@23 -- # accel_module=software 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val=32 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 
09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val=32 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val=1 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val=Yes 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val= 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:14.697 09:29:37 -- accel/accel.sh@21 -- # val= 00:06:14.697 09:29:37 -- accel/accel.sh@22 -- # case "$var" in 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # IFS=: 00:06:14.697 09:29:37 -- accel/accel.sh@20 -- # read -r var val 00:06:16.076 09:29:38 -- accel/accel.sh@21 -- # val= 00:06:16.076 09:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.076 09:29:38 -- accel/accel.sh@20 -- # IFS=: 00:06:16.076 09:29:38 -- accel/accel.sh@20 -- # read -r var val 00:06:16.076 09:29:38 -- accel/accel.sh@21 -- # val= 00:06:16.076 09:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.076 09:29:38 -- accel/accel.sh@20 -- # IFS=: 00:06:16.076 09:29:38 -- accel/accel.sh@20 -- # read -r var val 00:06:16.076 09:29:38 -- accel/accel.sh@21 -- # val= 00:06:16.076 09:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.076 09:29:38 -- accel/accel.sh@20 -- # IFS=: 00:06:16.076 09:29:38 -- accel/accel.sh@20 -- # read -r var val 00:06:16.076 09:29:38 -- accel/accel.sh@21 -- # val= 00:06:16.076 09:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.076 09:29:38 -- accel/accel.sh@20 -- # IFS=: 00:06:16.076 09:29:38 -- accel/accel.sh@20 -- # read -r var val 00:06:16.076 09:29:38 -- accel/accel.sh@21 -- # val= 00:06:16.076 09:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.076 09:29:38 -- accel/accel.sh@20 -- # IFS=: 00:06:16.076 09:29:38 -- accel/accel.sh@20 -- # read -r var val 00:06:16.076 09:29:38 -- accel/accel.sh@21 -- # val= 00:06:16.076 09:29:38 -- accel/accel.sh@22 -- # case "$var" in 00:06:16.076 09:29:38 -- accel/accel.sh@20 -- # IFS=: 00:06:16.076 09:29:38 -- accel/accel.sh@20 -- # read -r var val 00:06:16.076 09:29:38 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:16.076 09:29:38 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:16.076 09:29:38 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:16.076 00:06:16.076 real 0m2.640s 00:06:16.076 user 0m2.391s 00:06:16.076 sys 0m0.244s 00:06:16.076 09:29:38 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:16.076 09:29:38 -- common/autotest_common.sh@10 -- # set +x 00:06:16.076 ************************************ 00:06:16.076 END TEST accel_decomp 00:06:16.076 ************************************ 00:06:16.077 09:29:38 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:16.077 09:29:38 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:06:16.077 09:29:38 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:16.077 09:29:38 -- common/autotest_common.sh@10 -- # set +x 00:06:16.077 ************************************ 00:06:16.077 START TEST accel_decmop_full 00:06:16.077 ************************************ 00:06:16.077 09:29:38 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:16.077 09:29:38 -- accel/accel.sh@16 -- # local accel_opc 00:06:16.077 09:29:38 -- accel/accel.sh@17 -- # local accel_module 00:06:16.077 09:29:38 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:16.077 09:29:38 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:16.077 09:29:38 -- accel/accel.sh@12 -- # build_accel_config 00:06:16.077 09:29:38 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:16.077 09:29:38 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:16.077 09:29:38 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:16.077 09:29:38 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:16.077 09:29:38 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:16.077 09:29:38 -- accel/accel.sh@41 -- # local IFS=, 00:06:16.077 09:29:38 -- accel/accel.sh@42 -- # jq -r . 00:06:16.077 [2024-11-29 09:29:38.541627] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:16.077 [2024-11-29 09:29:38.541715] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3174316 ] 00:06:16.077 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.077 [2024-11-29 09:29:38.610524] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.077 [2024-11-29 09:29:38.678982] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.456 09:29:39 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:17.456 00:06:17.456 SPDK Configuration: 00:06:17.456 Core mask: 0x1 00:06:17.456 00:06:17.456 Accel Perf Configuration: 00:06:17.456 Workload Type: decompress 00:06:17.456 Transfer size: 111250 bytes 00:06:17.456 Vector count 1 00:06:17.456 Module: software 00:06:17.456 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:17.456 Queue depth: 32 00:06:17.456 Allocate depth: 32 00:06:17.456 # threads/core: 1 00:06:17.456 Run time: 1 seconds 00:06:17.456 Verify: Yes 00:06:17.456 00:06:17.456 Running for 1 seconds... 
00:06:17.456 00:06:17.456 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:17.456 ------------------------------------------------------------------------------------ 00:06:17.456 0,0 5856/s 241 MiB/s 0 0 00:06:17.456 ==================================================================================== 00:06:17.456 Total 5856/s 621 MiB/s 0 0' 00:06:17.456 09:29:39 -- accel/accel.sh@20 -- # IFS=: 00:06:17.456 09:29:39 -- accel/accel.sh@20 -- # read -r var val 00:06:17.456 09:29:39 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:17.456 09:29:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:06:17.456 09:29:39 -- accel/accel.sh@12 -- # build_accel_config 00:06:17.456 09:29:39 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:17.456 09:29:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:17.456 09:29:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:17.456 09:29:39 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:17.456 09:29:39 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:17.456 09:29:39 -- accel/accel.sh@41 -- # local IFS=, 00:06:17.456 09:29:39 -- accel/accel.sh@42 -- # jq -r . 00:06:17.456 [2024-11-29 09:29:39.868512] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:17.456 [2024-11-29 09:29:39.868576] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3174584 ] 00:06:17.456 EAL: No free 2048 kB hugepages reported on node 1 00:06:17.456 [2024-11-29 09:29:39.931542] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:17.456 [2024-11-29 09:29:40.000651] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.456 09:29:40 -- accel/accel.sh@21 -- # val= 00:06:17.456 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.456 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.456 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.456 09:29:40 -- accel/accel.sh@21 -- # val= 00:06:17.456 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.456 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.456 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.456 09:29:40 -- accel/accel.sh@21 -- # val= 00:06:17.456 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.456 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.457 09:29:40 -- accel/accel.sh@21 -- # val=0x1 00:06:17.457 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.457 09:29:40 -- accel/accel.sh@21 -- # val= 00:06:17.457 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.457 09:29:40 -- accel/accel.sh@21 -- # val= 00:06:17.457 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.457 09:29:40 -- accel/accel.sh@21 -- # val=decompress 00:06:17.457 09:29:40 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:17.457 09:29:40 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.457 09:29:40 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:17.457 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.457 09:29:40 -- accel/accel.sh@21 -- # val= 00:06:17.457 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.457 09:29:40 -- accel/accel.sh@21 -- # val=software 00:06:17.457 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.457 09:29:40 -- accel/accel.sh@23 -- # accel_module=software 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.457 09:29:40 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:17.457 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.457 09:29:40 -- accel/accel.sh@21 -- # val=32 00:06:17.457 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.457 09:29:40 -- accel/accel.sh@21 -- # val=32 00:06:17.457 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.457 09:29:40 -- accel/accel.sh@21 -- # val=1 00:06:17.457 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.457 09:29:40 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:17.457 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.457 09:29:40 -- accel/accel.sh@21 -- # val=Yes 00:06:17.457 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.457 09:29:40 -- accel/accel.sh@21 -- # val= 00:06:17.457 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:17.457 09:29:40 -- accel/accel.sh@21 -- # val= 00:06:17.457 09:29:40 -- accel/accel.sh@22 -- # case "$var" in 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # IFS=: 00:06:17.457 09:29:40 -- accel/accel.sh@20 -- # read -r var val 00:06:18.394 09:29:41 -- accel/accel.sh@21 -- # val= 00:06:18.394 09:29:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.394 09:29:41 -- accel/accel.sh@20 -- # IFS=: 00:06:18.394 09:29:41 -- accel/accel.sh@20 -- # read -r var val 00:06:18.394 09:29:41 -- accel/accel.sh@21 -- # val= 00:06:18.394 09:29:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.394 09:29:41 -- accel/accel.sh@20 -- # IFS=: 00:06:18.394 09:29:41 -- accel/accel.sh@20 -- # read -r var val 00:06:18.394 09:29:41 -- accel/accel.sh@21 -- # val= 00:06:18.394 09:29:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.394 09:29:41 
-- accel/accel.sh@20 -- # IFS=: 00:06:18.394 09:29:41 -- accel/accel.sh@20 -- # read -r var val 00:06:18.394 09:29:41 -- accel/accel.sh@21 -- # val= 00:06:18.394 09:29:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.394 09:29:41 -- accel/accel.sh@20 -- # IFS=: 00:06:18.394 09:29:41 -- accel/accel.sh@20 -- # read -r var val 00:06:18.394 09:29:41 -- accel/accel.sh@21 -- # val= 00:06:18.394 09:29:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.394 09:29:41 -- accel/accel.sh@20 -- # IFS=: 00:06:18.394 09:29:41 -- accel/accel.sh@20 -- # read -r var val 00:06:18.394 09:29:41 -- accel/accel.sh@21 -- # val= 00:06:18.394 09:29:41 -- accel/accel.sh@22 -- # case "$var" in 00:06:18.394 09:29:41 -- accel/accel.sh@20 -- # IFS=: 00:06:18.394 09:29:41 -- accel/accel.sh@20 -- # read -r var val 00:06:18.394 09:29:41 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:18.394 09:29:41 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:18.394 09:29:41 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:18.394 00:06:18.394 real 0m2.655s 00:06:18.394 user 0m2.416s 00:06:18.394 sys 0m0.233s 00:06:18.394 09:29:41 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:18.394 09:29:41 -- common/autotest_common.sh@10 -- # set +x 00:06:18.394 ************************************ 00:06:18.394 END TEST accel_decmop_full 00:06:18.394 ************************************ 00:06:18.394 09:29:41 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:18.394 09:29:41 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:06:18.394 09:29:41 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:18.394 09:29:41 -- common/autotest_common.sh@10 -- # set +x 00:06:18.394 ************************************ 00:06:18.394 START TEST accel_decomp_mcore 00:06:18.394 ************************************ 00:06:18.394 09:29:41 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:18.394 09:29:41 -- accel/accel.sh@16 -- # local accel_opc 00:06:18.394 09:29:41 -- accel/accel.sh@17 -- # local accel_module 00:06:18.394 09:29:41 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:18.394 09:29:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:18.394 09:29:41 -- accel/accel.sh@12 -- # build_accel_config 00:06:18.394 09:29:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:18.394 09:29:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:18.394 09:29:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:18.394 09:29:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:18.394 09:29:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:18.394 09:29:41 -- accel/accel.sh@41 -- # local IFS=, 00:06:18.394 09:29:41 -- accel/accel.sh@42 -- # jq -r . 00:06:18.394 [2024-11-29 09:29:41.233690] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
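This mcore run was launched with -m 0xf. The mask is a per-core bitmap: bits 0 through 3 are set, which is why four "Reactor started on core N" notices follow and why the results table below has four Core,Thread rows. A quick way to decode such a mask in the shell:

mask=0xf                    # core mask as passed via -m
for core in {0..7}; do
    if (( (mask >> core) & 1 )); then
        echo "core $core is in the mask"
    fi
done
# prints cores 0, 1, 2 and 3 for 0xf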
00:06:18.394 [2024-11-29 09:29:41.233777] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3174804 ] 00:06:18.654 EAL: No free 2048 kB hugepages reported on node 1 00:06:18.654 [2024-11-29 09:29:41.303558] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:18.654 [2024-11-29 09:29:41.382521] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:18.654 [2024-11-29 09:29:41.382537] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:18.654 [2024-11-29 09:29:41.382554] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:18.654 [2024-11-29 09:29:41.382556] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.033 09:29:42 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:20.033 00:06:20.033 SPDK Configuration: 00:06:20.033 Core mask: 0xf 00:06:20.033 00:06:20.033 Accel Perf Configuration: 00:06:20.033 Workload Type: decompress 00:06:20.033 Transfer size: 4096 bytes 00:06:20.033 Vector count 1 00:06:20.033 Module: software 00:06:20.033 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:20.033 Queue depth: 32 00:06:20.033 Allocate depth: 32 00:06:20.033 # threads/core: 1 00:06:20.033 Run time: 1 seconds 00:06:20.033 Verify: Yes 00:06:20.033 00:06:20.033 Running for 1 seconds... 00:06:20.033 00:06:20.033 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:20.033 ------------------------------------------------------------------------------------ 00:06:20.033 0,0 76352/s 140 MiB/s 0 0 00:06:20.033 3,0 76608/s 141 MiB/s 0 0 00:06:20.033 2,0 76448/s 140 MiB/s 0 0 00:06:20.033 1,0 76704/s 141 MiB/s 0 0 00:06:20.033 ==================================================================================== 00:06:20.033 Total 306112/s 1195 MiB/s 0 0' 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.033 09:29:42 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:20.033 09:29:42 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:06:20.033 09:29:42 -- accel/accel.sh@12 -- # build_accel_config 00:06:20.033 09:29:42 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:20.033 09:29:42 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:20.033 09:29:42 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:20.033 09:29:42 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:20.033 09:29:42 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:20.033 09:29:42 -- accel/accel.sh@41 -- # local IFS=, 00:06:20.033 09:29:42 -- accel/accel.sh@42 -- # jq -r . 00:06:20.033 [2024-11-29 09:29:42.578941] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
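The Total row of the table above is self-consistent: transfers per second times the 4096-byte transfer size gives the reported bandwidth. (The per-core bandwidth column evidently uses a different accounting, so only the Total row is checked here.)

echo $(( 306112 * 4096 / 1048576 ))   # -> 1195, matching "Total 306112/s 1195 MiB/s"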
00:06:20.033 [2024-11-29 09:29:42.579034] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3174994 ] 00:06:20.033 EAL: No free 2048 kB hugepages reported on node 1 00:06:20.033 [2024-11-29 09:29:42.648028] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:20.033 [2024-11-29 09:29:42.720325] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:20.033 [2024-11-29 09:29:42.720419] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:20.033 [2024-11-29 09:29:42.720506] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:20.033 [2024-11-29 09:29:42.720508] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.033 09:29:42 -- accel/accel.sh@21 -- # val= 00:06:20.033 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.033 09:29:42 -- accel/accel.sh@21 -- # val= 00:06:20.033 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.033 09:29:42 -- accel/accel.sh@21 -- # val= 00:06:20.033 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.033 09:29:42 -- accel/accel.sh@21 -- # val=0xf 00:06:20.033 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.033 09:29:42 -- accel/accel.sh@21 -- # val= 00:06:20.033 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.033 09:29:42 -- accel/accel.sh@21 -- # val= 00:06:20.033 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.033 09:29:42 -- accel/accel.sh@21 -- # val=decompress 00:06:20.033 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.033 09:29:42 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.033 09:29:42 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:20.033 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.033 09:29:42 -- accel/accel.sh@21 -- # val= 00:06:20.033 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.033 09:29:42 -- accel/accel.sh@21 -- # val=software 00:06:20.033 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.033 09:29:42 -- accel/accel.sh@23 -- # accel_module=software 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.033 09:29:42 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:20.033 09:29:42 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:20.033 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.034 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.034 09:29:42 -- accel/accel.sh@21 -- # val=32 00:06:20.034 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.034 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.034 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.034 09:29:42 -- accel/accel.sh@21 -- # val=32 00:06:20.034 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.034 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.034 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.034 09:29:42 -- accel/accel.sh@21 -- # val=1 00:06:20.034 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.034 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.034 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.034 09:29:42 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:20.034 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.034 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.034 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.034 09:29:42 -- accel/accel.sh@21 -- # val=Yes 00:06:20.034 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.034 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.034 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.034 09:29:42 -- accel/accel.sh@21 -- # val= 00:06:20.034 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.034 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.034 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:20.034 09:29:42 -- accel/accel.sh@21 -- # val= 00:06:20.034 09:29:42 -- accel/accel.sh@22 -- # case "$var" in 00:06:20.034 09:29:42 -- accel/accel.sh@20 -- # IFS=: 00:06:20.034 09:29:42 -- accel/accel.sh@20 -- # read -r var val 00:06:21.412 09:29:43 -- accel/accel.sh@21 -- # val= 00:06:21.412 09:29:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # IFS=: 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # read -r var val 00:06:21.412 09:29:43 -- accel/accel.sh@21 -- # val= 00:06:21.412 09:29:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # IFS=: 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # read -r var val 00:06:21.412 09:29:43 -- accel/accel.sh@21 -- # val= 00:06:21.412 09:29:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # IFS=: 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # read -r var val 00:06:21.412 09:29:43 -- accel/accel.sh@21 -- # val= 00:06:21.412 09:29:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # IFS=: 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # read -r var val 00:06:21.412 09:29:43 -- accel/accel.sh@21 -- # val= 00:06:21.412 09:29:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # IFS=: 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # read -r var val 00:06:21.412 09:29:43 -- accel/accel.sh@21 -- # val= 00:06:21.412 09:29:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # IFS=: 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # read -r var val 00:06:21.412 09:29:43 -- accel/accel.sh@21 -- # val= 00:06:21.412 09:29:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # IFS=: 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # read -r var val 00:06:21.412 09:29:43 -- accel/accel.sh@21 -- # val= 00:06:21.412 09:29:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.412 
09:29:43 -- accel/accel.sh@20 -- # IFS=: 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # read -r var val 00:06:21.412 09:29:43 -- accel/accel.sh@21 -- # val= 00:06:21.412 09:29:43 -- accel/accel.sh@22 -- # case "$var" in 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # IFS=: 00:06:21.412 09:29:43 -- accel/accel.sh@20 -- # read -r var val 00:06:21.412 09:29:43 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:21.412 09:29:43 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:21.412 09:29:43 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:21.412 00:06:21.412 real 0m2.693s 00:06:21.412 user 0m9.077s 00:06:21.412 sys 0m0.278s 00:06:21.412 09:29:43 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:21.412 09:29:43 -- common/autotest_common.sh@10 -- # set +x 00:06:21.412 ************************************ 00:06:21.412 END TEST accel_decomp_mcore 00:06:21.412 ************************************ 00:06:21.412 09:29:43 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:21.412 09:29:43 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:21.412 09:29:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:21.412 09:29:43 -- common/autotest_common.sh@10 -- # set +x 00:06:21.412 ************************************ 00:06:21.412 START TEST accel_decomp_full_mcore 00:06:21.412 ************************************ 00:06:21.412 09:29:43 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:21.413 09:29:43 -- accel/accel.sh@16 -- # local accel_opc 00:06:21.413 09:29:43 -- accel/accel.sh@17 -- # local accel_module 00:06:21.413 09:29:43 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:21.413 09:29:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:21.413 09:29:43 -- accel/accel.sh@12 -- # build_accel_config 00:06:21.413 09:29:43 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:21.413 09:29:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:21.413 09:29:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:21.413 09:29:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:21.413 09:29:43 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:21.413 09:29:43 -- accel/accel.sh@41 -- # local IFS=, 00:06:21.413 09:29:43 -- accel/accel.sh@42 -- # jq -r . 00:06:21.413 [2024-11-29 09:29:43.976692] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
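Every section of this log is bracketed by run_test, which prints the START TEST/END TEST banners and the real/user/sys timing seen at the end of each section. A minimal sketch of such a wrapper, assuming nothing about SPDK's actual implementation beyond what the banners and timing lines show:

run_test_sketch() {
    local name=$1
    shift
    echo "************************************"
    echo "START TEST $name"
    echo "************************************"
    time "$@"        # bash's time keyword produces the real/user/sys lines
    local rc=$?
    echo "************************************"
    echo "END TEST $name"
    echo "************************************"
    return $rc
}
# usage: run_test_sketch accel_decomp_full_mcore accel_test -t 1 -w decompress ...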
00:06:21.413 [2024-11-29 09:29:43.976790] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3175204 ] 00:06:21.413 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.413 [2024-11-29 09:29:44.045209] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:21.413 [2024-11-29 09:29:44.117562] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:21.413 [2024-11-29 09:29:44.117648] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:21.413 [2024-11-29 09:29:44.117694] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:21.413 [2024-11-29 09:29:44.117696] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.791 09:29:45 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:22.791 00:06:22.791 SPDK Configuration: 00:06:22.791 Core mask: 0xf 00:06:22.791 00:06:22.791 Accel Perf Configuration: 00:06:22.791 Workload Type: decompress 00:06:22.791 Transfer size: 111250 bytes 00:06:22.791 Vector count 1 00:06:22.791 Module: software 00:06:22.791 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:22.791 Queue depth: 32 00:06:22.791 Allocate depth: 32 00:06:22.791 # threads/core: 1 00:06:22.791 Run time: 1 seconds 00:06:22.791 Verify: Yes 00:06:22.791 00:06:22.791 Running for 1 seconds... 00:06:22.791 00:06:22.791 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:22.791 ------------------------------------------------------------------------------------ 00:06:22.791 0,0 5792/s 239 MiB/s 0 0 00:06:22.791 3,0 5824/s 240 MiB/s 0 0 00:06:22.791 2,0 5824/s 240 MiB/s 0 0 00:06:22.791 1,0 5824/s 240 MiB/s 0 0 00:06:22.791 ==================================================================================== 00:06:22.791 Total 23264/s 2468 MiB/s 0 0' 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.791 09:29:45 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:22.791 09:29:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:06:22.791 09:29:45 -- accel/accel.sh@12 -- # build_accel_config 00:06:22.791 09:29:45 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:22.791 09:29:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:22.791 09:29:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:22.791 09:29:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:22.791 09:29:45 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:22.791 09:29:45 -- accel/accel.sh@41 -- # local IFS=, 00:06:22.791 09:29:45 -- accel/accel.sh@42 -- # jq -r . 00:06:22.791 [2024-11-29 09:29:45.323806] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
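As with the other full-transfer tables, the Total row above checks out against the 111250-byte transfer size reported for -o 0:

echo $(( 23264 * 111250 / 1048576 ))   # -> 2468, matching "Total 23264/s 2468 MiB/s"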
00:06:22.791 [2024-11-29 09:29:45.323888] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3175456 ] 00:06:22.791 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.791 [2024-11-29 09:29:45.392633] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:22.791 [2024-11-29 09:29:45.463882] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:22.791 [2024-11-29 09:29:45.463978] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:22.791 [2024-11-29 09:29:45.464048] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:22.791 [2024-11-29 09:29:45.464050] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.791 09:29:45 -- accel/accel.sh@21 -- # val= 00:06:22.791 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.791 09:29:45 -- accel/accel.sh@21 -- # val= 00:06:22.791 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.791 09:29:45 -- accel/accel.sh@21 -- # val= 00:06:22.791 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.791 09:29:45 -- accel/accel.sh@21 -- # val=0xf 00:06:22.791 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.791 09:29:45 -- accel/accel.sh@21 -- # val= 00:06:22.791 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.791 09:29:45 -- accel/accel.sh@21 -- # val= 00:06:22.791 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.791 09:29:45 -- accel/accel.sh@21 -- # val=decompress 00:06:22.791 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.791 09:29:45 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.791 09:29:45 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:22.791 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.791 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.791 09:29:45 -- accel/accel.sh@21 -- # val= 00:06:22.792 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.792 09:29:45 -- accel/accel.sh@21 -- # val=software 00:06:22.792 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.792 09:29:45 -- accel/accel.sh@23 -- # accel_module=software 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.792 09:29:45 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:22.792 09:29:45 -- accel/accel.sh@22 -- # case 
"$var" in 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.792 09:29:45 -- accel/accel.sh@21 -- # val=32 00:06:22.792 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.792 09:29:45 -- accel/accel.sh@21 -- # val=32 00:06:22.792 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.792 09:29:45 -- accel/accel.sh@21 -- # val=1 00:06:22.792 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.792 09:29:45 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:22.792 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.792 09:29:45 -- accel/accel.sh@21 -- # val=Yes 00:06:22.792 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.792 09:29:45 -- accel/accel.sh@21 -- # val= 00:06:22.792 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:22.792 09:29:45 -- accel/accel.sh@21 -- # val= 00:06:22.792 09:29:45 -- accel/accel.sh@22 -- # case "$var" in 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # IFS=: 00:06:22.792 09:29:45 -- accel/accel.sh@20 -- # read -r var val 00:06:24.173 09:29:46 -- accel/accel.sh@21 -- # val= 00:06:24.173 09:29:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # IFS=: 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # read -r var val 00:06:24.173 09:29:46 -- accel/accel.sh@21 -- # val= 00:06:24.173 09:29:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # IFS=: 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # read -r var val 00:06:24.173 09:29:46 -- accel/accel.sh@21 -- # val= 00:06:24.173 09:29:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # IFS=: 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # read -r var val 00:06:24.173 09:29:46 -- accel/accel.sh@21 -- # val= 00:06:24.173 09:29:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # IFS=: 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # read -r var val 00:06:24.173 09:29:46 -- accel/accel.sh@21 -- # val= 00:06:24.173 09:29:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # IFS=: 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # read -r var val 00:06:24.173 09:29:46 -- accel/accel.sh@21 -- # val= 00:06:24.173 09:29:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # IFS=: 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # read -r var val 00:06:24.173 09:29:46 -- accel/accel.sh@21 -- # val= 00:06:24.173 09:29:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # IFS=: 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # read -r var val 00:06:24.173 09:29:46 -- accel/accel.sh@21 -- # val= 00:06:24.173 09:29:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.173 
09:29:46 -- accel/accel.sh@20 -- # IFS=: 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # read -r var val 00:06:24.173 09:29:46 -- accel/accel.sh@21 -- # val= 00:06:24.173 09:29:46 -- accel/accel.sh@22 -- # case "$var" in 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # IFS=: 00:06:24.173 09:29:46 -- accel/accel.sh@20 -- # read -r var val 00:06:24.173 09:29:46 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:24.173 09:29:46 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:24.173 09:29:46 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:24.173 00:06:24.173 real 0m2.701s 00:06:24.173 user 0m9.120s 00:06:24.173 sys 0m0.284s 00:06:24.173 09:29:46 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:24.173 09:29:46 -- common/autotest_common.sh@10 -- # set +x 00:06:24.173 ************************************ 00:06:24.173 END TEST accel_decomp_full_mcore 00:06:24.173 ************************************ 00:06:24.173 09:29:46 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:24.173 09:29:46 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:06:24.173 09:29:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:24.173 09:29:46 -- common/autotest_common.sh@10 -- # set +x 00:06:24.173 ************************************ 00:06:24.173 START TEST accel_decomp_mthread 00:06:24.173 ************************************ 00:06:24.173 09:29:46 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:24.173 09:29:46 -- accel/accel.sh@16 -- # local accel_opc 00:06:24.173 09:29:46 -- accel/accel.sh@17 -- # local accel_module 00:06:24.173 09:29:46 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:24.173 09:29:46 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:24.173 09:29:46 -- accel/accel.sh@12 -- # build_accel_config 00:06:24.173 09:29:46 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:24.173 09:29:46 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.173 09:29:46 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.173 09:29:46 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:24.173 09:29:46 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:24.173 09:29:46 -- accel/accel.sh@41 -- # local IFS=, 00:06:24.173 09:29:46 -- accel/accel.sh@42 -- # jq -r . 00:06:24.173 [2024-11-29 09:29:46.726514] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:24.173 [2024-11-29 09:29:46.726616] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3175746 ] 00:06:24.173 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.173 [2024-11-29 09:29:46.795339] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.173 [2024-11-29 09:29:46.864851] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.554 09:29:48 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:06:25.554 00:06:25.554 SPDK Configuration: 00:06:25.554 Core mask: 0x1 00:06:25.554 00:06:25.554 Accel Perf Configuration: 00:06:25.554 Workload Type: decompress 00:06:25.554 Transfer size: 4096 bytes 00:06:25.554 Vector count 1 00:06:25.554 Module: software 00:06:25.554 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:25.554 Queue depth: 32 00:06:25.554 Allocate depth: 32 00:06:25.554 # threads/core: 2 00:06:25.554 Run time: 1 seconds 00:06:25.554 Verify: Yes 00:06:25.554 00:06:25.554 Running for 1 seconds... 00:06:25.554 00:06:25.554 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:25.554 ------------------------------------------------------------------------------------ 00:06:25.554 0,1 45696/s 84 MiB/s 0 0 00:06:25.554 0,0 45600/s 84 MiB/s 0 0 00:06:25.554 ==================================================================================== 00:06:25.554 Total 91296/s 356 MiB/s 0 0' 00:06:25.554 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.554 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.554 09:29:48 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:25.554 09:29:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:06:25.554 09:29:48 -- accel/accel.sh@12 -- # build_accel_config 00:06:25.554 09:29:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:25.554 09:29:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:25.554 09:29:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:25.554 09:29:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:25.554 09:29:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:25.554 09:29:48 -- accel/accel.sh@41 -- # local IFS=, 00:06:25.554 09:29:48 -- accel/accel.sh@42 -- # jq -r . 00:06:25.554 [2024-11-29 09:29:48.057073] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
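The -T 2 flag in this mthread variant runs two worker threads on core 0, which is why the table above has Core,Thread rows 0,0 and 0,1 whose transfer rates sum to the Total row. A small awk sketch that recomputes that sum from a saved copy of the table (results.txt is a hypothetical capture, not a file produced by this job):

awk '/^[0-9]+,[0-9]+ / { total += $2 + 0 } END { print total " transfers/s" }' results.txt
# 45696 + 45600 -> 91296, matching "Total 91296/s"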
00:06:25.555 [2024-11-29 09:29:48.057164] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3176012 ] 00:06:25.555 EAL: No free 2048 kB hugepages reported on node 1 00:06:25.555 [2024-11-29 09:29:48.127421] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:25.555 [2024-11-29 09:29:48.196561] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val= 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val= 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val= 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val=0x1 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val= 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val= 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val=decompress 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val= 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val=software 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@23 -- # accel_module=software 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val=32 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 
09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val=32 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val=2 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val=Yes 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val= 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:25.555 09:29:48 -- accel/accel.sh@21 -- # val= 00:06:25.555 09:29:48 -- accel/accel.sh@22 -- # case "$var" in 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # IFS=: 00:06:25.555 09:29:48 -- accel/accel.sh@20 -- # read -r var val 00:06:26.935 09:29:49 -- accel/accel.sh@21 -- # val= 00:06:26.935 09:29:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.935 09:29:49 -- accel/accel.sh@20 -- # IFS=: 00:06:26.935 09:29:49 -- accel/accel.sh@20 -- # read -r var val 00:06:26.935 09:29:49 -- accel/accel.sh@21 -- # val= 00:06:26.935 09:29:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.935 09:29:49 -- accel/accel.sh@20 -- # IFS=: 00:06:26.935 09:29:49 -- accel/accel.sh@20 -- # read -r var val 00:06:26.935 09:29:49 -- accel/accel.sh@21 -- # val= 00:06:26.935 09:29:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.935 09:29:49 -- accel/accel.sh@20 -- # IFS=: 00:06:26.935 09:29:49 -- accel/accel.sh@20 -- # read -r var val 00:06:26.936 09:29:49 -- accel/accel.sh@21 -- # val= 00:06:26.936 09:29:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.936 09:29:49 -- accel/accel.sh@20 -- # IFS=: 00:06:26.936 09:29:49 -- accel/accel.sh@20 -- # read -r var val 00:06:26.936 09:29:49 -- accel/accel.sh@21 -- # val= 00:06:26.936 09:29:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.936 09:29:49 -- accel/accel.sh@20 -- # IFS=: 00:06:26.936 09:29:49 -- accel/accel.sh@20 -- # read -r var val 00:06:26.936 09:29:49 -- accel/accel.sh@21 -- # val= 00:06:26.936 09:29:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.936 09:29:49 -- accel/accel.sh@20 -- # IFS=: 00:06:26.936 09:29:49 -- accel/accel.sh@20 -- # read -r var val 00:06:26.936 09:29:49 -- accel/accel.sh@21 -- # val= 00:06:26.936 09:29:49 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.936 09:29:49 -- accel/accel.sh@20 -- # IFS=: 00:06:26.936 09:29:49 -- accel/accel.sh@20 -- # read -r var val 00:06:26.936 09:29:49 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:26.936 09:29:49 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:26.936 09:29:49 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:26.936 00:06:26.936 real 0m2.671s 00:06:26.936 user 0m2.417s 00:06:26.936 sys 0m0.264s 00:06:26.936 09:29:49 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:26.936 09:29:49 -- common/autotest_common.sh@10 -- # 
set +x 00:06:26.936 ************************************ 00:06:26.936 END TEST accel_decomp_mthread 00:06:26.936 ************************************ 00:06:26.936 09:29:49 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:26.936 09:29:49 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:26.936 09:29:49 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:26.936 09:29:49 -- common/autotest_common.sh@10 -- # set +x 00:06:26.936 ************************************ 00:06:26.936 START TEST accel_deomp_full_mthread 00:06:26.936 ************************************ 00:06:26.936 09:29:49 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:26.936 09:29:49 -- accel/accel.sh@16 -- # local accel_opc 00:06:26.936 09:29:49 -- accel/accel.sh@17 -- # local accel_module 00:06:26.936 09:29:49 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:26.936 09:29:49 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:26.936 09:29:49 -- accel/accel.sh@12 -- # build_accel_config 00:06:26.936 09:29:49 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:26.936 09:29:49 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:26.936 09:29:49 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:26.936 09:29:49 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:26.936 09:29:49 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:26.936 09:29:49 -- accel/accel.sh@41 -- # local IFS=, 00:06:26.936 09:29:49 -- accel/accel.sh@42 -- # jq -r . 00:06:26.936 [2024-11-29 09:29:49.447706] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:26.936 [2024-11-29 09:29:49.447789] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3176301 ] 00:06:26.936 EAL: No free 2048 kB hugepages reported on node 1 00:06:26.936 [2024-11-29 09:29:49.518540] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:26.936 [2024-11-29 09:29:49.587348] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.317 09:29:50 -- accel/accel.sh@18 -- # out='Preparing input file... 00:06:28.317 00:06:28.317 SPDK Configuration: 00:06:28.317 Core mask: 0x1 00:06:28.317 00:06:28.317 Accel Perf Configuration: 00:06:28.317 Workload Type: decompress 00:06:28.317 Transfer size: 111250 bytes 00:06:28.317 Vector count 1 00:06:28.317 Module: software 00:06:28.317 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:28.317 Queue depth: 32 00:06:28.317 Allocate depth: 32 00:06:28.317 # threads/core: 2 00:06:28.317 Run time: 1 seconds 00:06:28.317 Verify: Yes 00:06:28.317 00:06:28.317 Running for 1 seconds... 
00:06:28.317 00:06:28.317 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:28.317 ------------------------------------------------------------------------------------ 00:06:28.317 0,1 2912/s 120 MiB/s 0 0 00:06:28.317 0,0 2880/s 118 MiB/s 0 0 00:06:28.317 ==================================================================================== 00:06:28.317 Total 5792/s 614 MiB/s 0 0' 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:28.317 09:29:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:06:28.317 09:29:50 -- accel/accel.sh@12 -- # build_accel_config 00:06:28.317 09:29:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:28.317 09:29:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:28.317 09:29:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:28.317 09:29:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:28.317 09:29:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:28.317 09:29:50 -- accel/accel.sh@41 -- # local IFS=, 00:06:28.317 09:29:50 -- accel/accel.sh@42 -- # jq -r . 00:06:28.317 [2024-11-29 09:29:50.798308] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:28.317 [2024-11-29 09:29:50.798398] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3176573 ] 00:06:28.317 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.317 [2024-11-29 09:29:50.866294] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:28.317 [2024-11-29 09:29:50.934399] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val= 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val= 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val= 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val=0x1 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val= 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val= 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val=decompress 
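The IFS=: / read -r var val / case "$var" pattern that fills this trace appears to be accel.sh consuming accel_perf's configuration dump one "key: value" line at a time; that interpretation is inferred from the trace, but the idiom itself, in isolation, looks like this:

while IFS=: read -r var val; do
    case "$var" in
        "Workload Type") workload=${val# } ;;   # ${val# } strips the space after the colon
        "Run time")      run_time=${val# } ;;
    esac
done <<'EOF'
Workload Type: decompress
Run time: 1 seconds
EOF
echo "$workload for $run_time"   # -> decompress for 1 seconds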
00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@24 -- # accel_opc=decompress 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val='111250 bytes' 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val= 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val=software 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@23 -- # accel_module=software 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val=32 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val=32 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val=2 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val=Yes 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val= 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:28.317 09:29:50 -- accel/accel.sh@21 -- # val= 00:06:28.317 09:29:50 -- accel/accel.sh@22 -- # case "$var" in 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # IFS=: 00:06:28.317 09:29:50 -- accel/accel.sh@20 -- # read -r var val 00:06:29.697 09:29:52 -- accel/accel.sh@21 -- # val= 00:06:29.697 09:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.697 09:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:29.697 09:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:29.697 09:29:52 -- accel/accel.sh@21 -- # val= 00:06:29.697 09:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.697 09:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:29.697 09:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:29.697 09:29:52 -- accel/accel.sh@21 -- # val= 00:06:29.697 09:29:52 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:29.697 09:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:29.697 09:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:29.697 09:29:52 -- accel/accel.sh@21 -- # val= 00:06:29.697 09:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.697 09:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:29.697 09:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:29.697 09:29:52 -- accel/accel.sh@21 -- # val= 00:06:29.697 09:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.697 09:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:29.697 09:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:29.697 09:29:52 -- accel/accel.sh@21 -- # val= 00:06:29.697 09:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.697 09:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:29.697 09:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:29.697 09:29:52 -- accel/accel.sh@21 -- # val= 00:06:29.697 09:29:52 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.697 09:29:52 -- accel/accel.sh@20 -- # IFS=: 00:06:29.697 09:29:52 -- accel/accel.sh@20 -- # read -r var val 00:06:29.697 09:29:52 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:29.697 09:29:52 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:06:29.697 09:29:52 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:29.697 00:06:29.697 real 0m2.707s 00:06:29.697 user 0m2.456s 00:06:29.697 sys 0m0.258s 00:06:29.697 09:29:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:29.697 09:29:52 -- common/autotest_common.sh@10 -- # set +x 00:06:29.697 ************************************ 00:06:29.697 END TEST accel_deomp_full_mthread 00:06:29.697 ************************************ 00:06:29.697 09:29:52 -- accel/accel.sh@116 -- # [[ n == y ]] 00:06:29.697 09:29:52 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:06:29.697 09:29:52 -- accel/accel.sh@129 -- # build_accel_config 00:06:29.697 09:29:52 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:06:29.697 09:29:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:29.697 09:29:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:29.697 09:29:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:29.697 09:29:52 -- common/autotest_common.sh@10 -- # set +x 00:06:29.697 09:29:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:29.697 09:29:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:29.697 09:29:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:29.697 09:29:52 -- accel/accel.sh@41 -- # local IFS=, 00:06:29.697 09:29:52 -- accel/accel.sh@42 -- # jq -r . 00:06:29.697 ************************************ 00:06:29.697 START TEST accel_dif_functional_tests 00:06:29.697 ************************************ 00:06:29.697 09:29:52 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:06:29.697 [2024-11-29 09:29:52.204660] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
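The accel_dif suite starting here exercises T10 protection-information checks, which is why its cases are named GUARD, APPTAG and REFTAG: the guard is a CRC over the block data, the application tag is a 2-byte field, and the reference tag is a 4-byte field typically tied to the LBA. The negative cases corrupt one field on purpose and expect the *ERROR* lines below. A standalone run would mirror the traced command; whether the binary accepts an empty JSON config is an assumption here (the wrapper normally feeds a real config on /dev/fd/62):

SPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
"$SPDK_DIR/test/accel/dif/dif" -c <(printf '{}')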
00:06:29.697 [2024-11-29 09:29:52.204752] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3176857 ] 00:06:29.697 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.697 [2024-11-29 09:29:52.272492] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:29.697 [2024-11-29 09:29:52.342642] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:29.697 [2024-11-29 09:29:52.342740] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:29.697 [2024-11-29 09:29:52.342742] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.697 00:06:29.697 00:06:29.697 CUnit - A unit testing framework for C - Version 2.1-3 00:06:29.697 http://cunit.sourceforge.net/ 00:06:29.697 00:06:29.697 00:06:29.697 Suite: accel_dif 00:06:29.697 Test: verify: DIF generated, GUARD check ...passed 00:06:29.697 Test: verify: DIF generated, APPTAG check ...passed 00:06:29.697 Test: verify: DIF generated, REFTAG check ...passed 00:06:29.697 Test: verify: DIF not generated, GUARD check ...[2024-11-29 09:29:52.410907] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:06:29.697 [2024-11-29 09:29:52.410960] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:06:29.697 passed 00:06:29.697 Test: verify: DIF not generated, APPTAG check ...[2024-11-29 09:29:52.411011] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:06:29.697 [2024-11-29 09:29:52.411030] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:06:29.697 passed 00:06:29.697 Test: verify: DIF not generated, REFTAG check ...[2024-11-29 09:29:52.411050] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:06:29.697 [2024-11-29 09:29:52.411069] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:06:29.697 passed 00:06:29.697 Test: verify: APPTAG correct, APPTAG check ...passed 00:06:29.697 Test: verify: APPTAG incorrect, APPTAG check ...[2024-11-29 09:29:52.411114] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:06:29.697 passed 00:06:29.697 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:06:29.697 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:06:29.697 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:06:29.697 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-11-29 09:29:52.411214] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:06:29.697 passed 00:06:29.697 Test: generate copy: DIF generated, GUARD check ...passed 00:06:29.697 Test: generate copy: DIF generated, APTTAG check ...passed 00:06:29.697 Test: generate copy: DIF generated, REFTAG check ...passed 00:06:29.697 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:06:29.697 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:06:29.697 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:06:29.697 Test: generate copy: iovecs-len validate ...[2024-11-29 09:29:52.411399] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:06:29.697 passed 00:06:29.697 Test: generate copy: buffer alignment validate ...passed 00:06:29.697 00:06:29.697 Run Summary: Type Total Ran Passed Failed Inactive 00:06:29.697 suites 1 1 n/a 0 0 00:06:29.697 tests 20 20 20 0 0 00:06:29.697 asserts 204 204 204 0 n/a 00:06:29.697 00:06:29.697 Elapsed time = 0.000 seconds 00:06:29.957 00:06:29.957 real 0m0.388s 00:06:29.957 user 0m0.584s 00:06:29.957 sys 0m0.156s 00:06:29.957 09:29:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:29.957 09:29:52 -- common/autotest_common.sh@10 -- # set +x 00:06:29.957 ************************************ 00:06:29.957 END TEST accel_dif_functional_tests 00:06:29.957 ************************************ 00:06:29.957 00:06:29.957 real 0m57.032s 00:06:29.957 user 1m4.669s 00:06:29.957 sys 0m7.033s 00:06:29.957 09:29:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:29.957 09:29:52 -- common/autotest_common.sh@10 -- # set +x 00:06:29.957 ************************************ 00:06:29.957 END TEST accel 00:06:29.957 ************************************ 00:06:29.957 09:29:52 -- spdk/autotest.sh@177 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:06:29.957 09:29:52 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:29.957 09:29:52 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:29.957 09:29:52 -- common/autotest_common.sh@10 -- # set +x 00:06:29.957 ************************************ 00:06:29.957 START TEST accel_rpc 00:06:29.957 ************************************ 00:06:29.957 09:29:52 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:06:29.957 * Looking for test storage... 00:06:29.957 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:29.957 09:29:52 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:29.957 09:29:52 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:29.957 09:29:52 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:29.957 09:29:52 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:29.957 09:29:52 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:29.957 09:29:52 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:29.957 09:29:52 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:29.957 09:29:52 -- scripts/common.sh@335 -- # IFS=.-: 00:06:29.957 09:29:52 -- scripts/common.sh@335 -- # read -ra ver1 00:06:29.957 09:29:52 -- scripts/common.sh@336 -- # IFS=.-: 00:06:29.957 09:29:52 -- scripts/common.sh@336 -- # read -ra ver2 00:06:29.957 09:29:52 -- scripts/common.sh@337 -- # local 'op=<' 00:06:29.957 09:29:52 -- scripts/common.sh@339 -- # ver1_l=2 00:06:29.957 09:29:52 -- scripts/common.sh@340 -- # ver2_l=1 00:06:29.957 09:29:52 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:29.957 09:29:52 -- scripts/common.sh@343 -- # case "$op" in 00:06:29.957 09:29:52 -- scripts/common.sh@344 -- # : 1 00:06:29.957 09:29:52 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:29.957 09:29:52 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:29.957 09:29:52 -- scripts/common.sh@364 -- # decimal 1 00:06:29.957 09:29:52 -- scripts/common.sh@352 -- # local d=1 00:06:29.958 09:29:52 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:29.958 09:29:52 -- scripts/common.sh@354 -- # echo 1 00:06:30.217 09:29:52 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:30.217 09:29:52 -- scripts/common.sh@365 -- # decimal 2 00:06:30.217 09:29:52 -- scripts/common.sh@352 -- # local d=2 00:06:30.217 09:29:52 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:30.217 09:29:52 -- scripts/common.sh@354 -- # echo 2 00:06:30.217 09:29:52 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:30.217 09:29:52 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:30.217 09:29:52 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:30.217 09:29:52 -- scripts/common.sh@367 -- # return 0 00:06:30.217 09:29:52 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:30.217 09:29:52 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:30.217 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:30.217 --rc genhtml_branch_coverage=1 00:06:30.217 --rc genhtml_function_coverage=1 00:06:30.217 --rc genhtml_legend=1 00:06:30.217 --rc geninfo_all_blocks=1 00:06:30.217 --rc geninfo_unexecuted_blocks=1 00:06:30.217 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:30.217 ' 00:06:30.217 09:29:52 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:30.217 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:30.217 --rc genhtml_branch_coverage=1 00:06:30.217 --rc genhtml_function_coverage=1 00:06:30.217 --rc genhtml_legend=1 00:06:30.217 --rc geninfo_all_blocks=1 00:06:30.217 --rc geninfo_unexecuted_blocks=1 00:06:30.217 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:30.217 ' 00:06:30.217 09:29:52 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:30.217 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:30.217 --rc genhtml_branch_coverage=1 00:06:30.217 --rc genhtml_function_coverage=1 00:06:30.217 --rc genhtml_legend=1 00:06:30.217 --rc geninfo_all_blocks=1 00:06:30.217 --rc geninfo_unexecuted_blocks=1 00:06:30.217 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:30.217 ' 00:06:30.217 09:29:52 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:30.217 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:30.217 --rc genhtml_branch_coverage=1 00:06:30.217 --rc genhtml_function_coverage=1 00:06:30.217 --rc genhtml_legend=1 00:06:30.217 --rc geninfo_all_blocks=1 00:06:30.217 --rc geninfo_unexecuted_blocks=1 00:06:30.217 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:30.217 ' 00:06:30.217 09:29:52 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:30.217 09:29:52 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=3176937 00:06:30.217 09:29:52 -- accel/accel_rpc.sh@15 -- # waitforlisten 3176937 00:06:30.217 09:29:52 -- common/autotest_common.sh@829 -- # '[' -z 3176937 ']' 00:06:30.217 09:29:52 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:30.217 09:29:52 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:30.217 09:29:52 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk.sock...' 00:06:30.217 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:30.217 09:29:52 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:30.217 09:29:52 -- common/autotest_common.sh@10 -- # set +x 00:06:30.217 09:29:52 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:06:30.217 [2024-11-29 09:29:52.831494] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:30.217 [2024-11-29 09:29:52.831588] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3176937 ] 00:06:30.217 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.217 [2024-11-29 09:29:52.899362] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:30.217 [2024-11-29 09:29:52.975112] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:30.217 [2024-11-29 09:29:52.975226] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.153 09:29:53 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:31.153 09:29:53 -- common/autotest_common.sh@862 -- # return 0 00:06:31.153 09:29:53 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:06:31.153 09:29:53 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:06:31.153 09:29:53 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:06:31.153 09:29:53 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:06:31.153 09:29:53 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:06:31.153 09:29:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:31.153 09:29:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:31.153 09:29:53 -- common/autotest_common.sh@10 -- # set +x 00:06:31.153 ************************************ 00:06:31.153 START TEST accel_assign_opcode 00:06:31.153 ************************************ 00:06:31.153 09:29:53 -- common/autotest_common.sh@1114 -- # accel_assign_opcode_test_suite 00:06:31.153 09:29:53 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:06:31.153 09:29:53 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.153 09:29:53 -- common/autotest_common.sh@10 -- # set +x 00:06:31.153 [2024-11-29 09:29:53.653233] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:06:31.153 09:29:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.153 09:29:53 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:06:31.153 09:29:53 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.153 09:29:53 -- common/autotest_common.sh@10 -- # set +x 00:06:31.153 [2024-11-29 09:29:53.661247] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:06:31.153 09:29:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.153 09:29:53 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:06:31.153 09:29:53 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:31.153 09:29:53 -- common/autotest_common.sh@10 -- # set +x 00:06:31.153 09:29:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.153 09:29:53 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:06:31.153 09:29:53 -- common/autotest_common.sh@561 -- # xtrace_disable 
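The accel_assign_opcode sequence traced here reduces to three RPCs; a condensed sketch, with the rpc.py path taken from this workspace and the expected result shown as a comment:

    # spdk_tgt was started with --wait-for-rpc, so opcode assignments are
    # accepted before the accel framework initializes.
    rpc=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
    $rpc accel_assign_opc -o copy -m software     # pin the copy opcode to the software module
    $rpc framework_start_init                     # now let the subsystems come up
    $rpc accel_get_opc_assignments | jq -r .copy  # prints: software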
00:06:31.153 09:29:53 -- common/autotest_common.sh@10 -- # set +x 00:06:31.153 09:29:53 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:06:31.153 09:29:53 -- accel/accel_rpc.sh@42 -- # grep software 00:06:31.153 09:29:53 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:31.153 software 00:06:31.153 00:06:31.153 real 0m0.231s 00:06:31.153 user 0m0.039s 00:06:31.153 sys 0m0.011s 00:06:31.153 09:29:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:31.153 09:29:53 -- common/autotest_common.sh@10 -- # set +x 00:06:31.154 ************************************ 00:06:31.154 END TEST accel_assign_opcode 00:06:31.154 ************************************ 00:06:31.154 09:29:53 -- accel/accel_rpc.sh@55 -- # killprocess 3176937 00:06:31.154 09:29:53 -- common/autotest_common.sh@936 -- # '[' -z 3176937 ']' 00:06:31.154 09:29:53 -- common/autotest_common.sh@940 -- # kill -0 3176937 00:06:31.154 09:29:53 -- common/autotest_common.sh@941 -- # uname 00:06:31.154 09:29:53 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:31.154 09:29:53 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3176937 00:06:31.154 09:29:53 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:31.154 09:29:53 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:31.154 09:29:53 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3176937' 00:06:31.154 killing process with pid 3176937 00:06:31.154 09:29:53 -- common/autotest_common.sh@955 -- # kill 3176937 00:06:31.154 09:29:53 -- common/autotest_common.sh@960 -- # wait 3176937 00:06:31.722 00:06:31.722 real 0m1.619s 00:06:31.722 user 0m1.638s 00:06:31.722 sys 0m0.452s 00:06:31.722 09:29:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:31.722 09:29:54 -- common/autotest_common.sh@10 -- # set +x 00:06:31.722 ************************************ 00:06:31.722 END TEST accel_rpc 00:06:31.722 ************************************ 00:06:31.722 09:29:54 -- spdk/autotest.sh@178 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:06:31.722 09:29:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:31.722 09:29:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:31.722 09:29:54 -- common/autotest_common.sh@10 -- # set +x 00:06:31.722 ************************************ 00:06:31.722 START TEST app_cmdline 00:06:31.722 ************************************ 00:06:31.722 09:29:54 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:06:31.722 * Looking for test storage... 
00:06:31.722 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:31.722 09:29:54 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:31.722 09:29:54 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:31.722 09:29:54 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:31.722 09:29:54 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:31.722 09:29:54 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:31.722 09:29:54 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:31.722 09:29:54 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:31.722 09:29:54 -- scripts/common.sh@335 -- # IFS=.-: 00:06:31.722 09:29:54 -- scripts/common.sh@335 -- # read -ra ver1 00:06:31.722 09:29:54 -- scripts/common.sh@336 -- # IFS=.-: 00:06:31.722 09:29:54 -- scripts/common.sh@336 -- # read -ra ver2 00:06:31.722 09:29:54 -- scripts/common.sh@337 -- # local 'op=<' 00:06:31.722 09:29:54 -- scripts/common.sh@339 -- # ver1_l=2 00:06:31.722 09:29:54 -- scripts/common.sh@340 -- # ver2_l=1 00:06:31.722 09:29:54 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:31.722 09:29:54 -- scripts/common.sh@343 -- # case "$op" in 00:06:31.722 09:29:54 -- scripts/common.sh@344 -- # : 1 00:06:31.722 09:29:54 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:31.722 09:29:54 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:31.722 09:29:54 -- scripts/common.sh@364 -- # decimal 1 00:06:31.722 09:29:54 -- scripts/common.sh@352 -- # local d=1 00:06:31.722 09:29:54 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:31.722 09:29:54 -- scripts/common.sh@354 -- # echo 1 00:06:31.722 09:29:54 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:31.722 09:29:54 -- scripts/common.sh@365 -- # decimal 2 00:06:31.722 09:29:54 -- scripts/common.sh@352 -- # local d=2 00:06:31.722 09:29:54 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:31.722 09:29:54 -- scripts/common.sh@354 -- # echo 2 00:06:31.722 09:29:54 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:31.722 09:29:54 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:31.722 09:29:54 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:31.722 09:29:54 -- scripts/common.sh@367 -- # return 0 00:06:31.722 09:29:54 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:31.722 09:29:54 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:31.722 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:31.722 --rc genhtml_branch_coverage=1 00:06:31.722 --rc genhtml_function_coverage=1 00:06:31.722 --rc genhtml_legend=1 00:06:31.722 --rc geninfo_all_blocks=1 00:06:31.722 --rc geninfo_unexecuted_blocks=1 00:06:31.722 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:31.722 ' 00:06:31.722 09:29:54 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:31.722 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:31.722 --rc genhtml_branch_coverage=1 00:06:31.722 --rc genhtml_function_coverage=1 00:06:31.722 --rc genhtml_legend=1 00:06:31.722 --rc geninfo_all_blocks=1 00:06:31.722 --rc geninfo_unexecuted_blocks=1 00:06:31.722 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:31.722 ' 00:06:31.722 09:29:54 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:31.722 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:31.722 --rc genhtml_branch_coverage=1 00:06:31.722 
--rc genhtml_function_coverage=1 00:06:31.722 --rc genhtml_legend=1 00:06:31.722 --rc geninfo_all_blocks=1 00:06:31.722 --rc geninfo_unexecuted_blocks=1 00:06:31.722 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:31.722 ' 00:06:31.722 09:29:54 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:31.722 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:31.722 --rc genhtml_branch_coverage=1 00:06:31.722 --rc genhtml_function_coverage=1 00:06:31.722 --rc genhtml_legend=1 00:06:31.722 --rc geninfo_all_blocks=1 00:06:31.722 --rc geninfo_unexecuted_blocks=1 00:06:31.722 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:31.722 ' 00:06:31.722 09:29:54 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:06:31.722 09:29:54 -- app/cmdline.sh@17 -- # spdk_tgt_pid=3177282 00:06:31.722 09:29:54 -- app/cmdline.sh@18 -- # waitforlisten 3177282 00:06:31.722 09:29:54 -- common/autotest_common.sh@829 -- # '[' -z 3177282 ']' 00:06:31.723 09:29:54 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:31.723 09:29:54 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:31.723 09:29:54 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:31.723 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:31.723 09:29:54 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:31.723 09:29:54 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:06:31.723 09:29:54 -- common/autotest_common.sh@10 -- # set +x 00:06:31.723 [2024-11-29 09:29:54.505338] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
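This spdk_tgt instance runs with --rpcs-allowed spdk_get_version,rpc_get_methods, so only those two methods are reachable; a sketch of the behaviour the test verifies just below (invocation simplified):

    spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
    rpc.py rpc_get_methods          # lists exactly the two allowed methods
    rpc.py spdk_get_version         # returns the version object logged below
    rpc.py env_dpdk_get_mem_stats   # rejected: "Method not found" (code -32601)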
00:06:31.723 [2024-11-29 09:29:54.505421] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3177282 ] 00:06:31.723 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.982 [2024-11-29 09:29:54.574380] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.982 [2024-11-29 09:29:54.649714] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:31.982 [2024-11-29 09:29:54.649827] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:32.918 09:29:55 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:32.918 09:29:55 -- common/autotest_common.sh@862 -- # return 0 00:06:32.918 09:29:55 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:06:32.918 { 00:06:32.918 "version": "SPDK v24.01.1-pre git sha1 c13c99a5e", 00:06:32.918 "fields": { 00:06:32.918 "major": 24, 00:06:32.918 "minor": 1, 00:06:32.918 "patch": 1, 00:06:32.918 "suffix": "-pre", 00:06:32.918 "commit": "c13c99a5e" 00:06:32.918 } 00:06:32.918 } 00:06:32.918 09:29:55 -- app/cmdline.sh@22 -- # expected_methods=() 00:06:32.918 09:29:55 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:06:32.918 09:29:55 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:06:32.918 09:29:55 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:06:32.918 09:29:55 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:06:32.918 09:29:55 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:32.918 09:29:55 -- common/autotest_common.sh@10 -- # set +x 00:06:32.918 09:29:55 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:06:32.918 09:29:55 -- app/cmdline.sh@26 -- # sort 00:06:32.918 09:29:55 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:32.918 09:29:55 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:06:32.918 09:29:55 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:06:32.918 09:29:55 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:32.918 09:29:55 -- common/autotest_common.sh@650 -- # local es=0 00:06:32.918 09:29:55 -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:32.918 09:29:55 -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:32.918 09:29:55 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:32.918 09:29:55 -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:32.918 09:29:55 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:32.918 09:29:55 -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:32.918 09:29:55 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:32.918 09:29:55 -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:06:32.918 09:29:55 -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:06:32.918 09:29:55 -- 
common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:33.177 request: 00:06:33.177 { 00:06:33.177 "method": "env_dpdk_get_mem_stats", 00:06:33.177 "req_id": 1 00:06:33.177 } 00:06:33.177 Got JSON-RPC error response 00:06:33.177 response: 00:06:33.177 { 00:06:33.177 "code": -32601, 00:06:33.177 "message": "Method not found" 00:06:33.177 } 00:06:33.177 09:29:55 -- common/autotest_common.sh@653 -- # es=1 00:06:33.177 09:29:55 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:33.177 09:29:55 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:33.177 09:29:55 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:33.177 09:29:55 -- app/cmdline.sh@1 -- # killprocess 3177282 00:06:33.177 09:29:55 -- common/autotest_common.sh@936 -- # '[' -z 3177282 ']' 00:06:33.177 09:29:55 -- common/autotest_common.sh@940 -- # kill -0 3177282 00:06:33.177 09:29:55 -- common/autotest_common.sh@941 -- # uname 00:06:33.177 09:29:55 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:33.177 09:29:55 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3177282 00:06:33.177 09:29:55 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:33.177 09:29:55 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:33.177 09:29:55 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3177282' 00:06:33.177 killing process with pid 3177282 00:06:33.177 09:29:55 -- common/autotest_common.sh@955 -- # kill 3177282 00:06:33.177 09:29:55 -- common/autotest_common.sh@960 -- # wait 3177282 00:06:33.476 00:06:33.476 real 0m1.834s 00:06:33.476 user 0m2.169s 00:06:33.476 sys 0m0.512s 00:06:33.476 09:29:56 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:33.476 09:29:56 -- common/autotest_common.sh@10 -- # set +x 00:06:33.476 ************************************ 00:06:33.476 END TEST app_cmdline 00:06:33.476 ************************************ 00:06:33.476 09:29:56 -- spdk/autotest.sh@179 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:06:33.476 09:29:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:33.476 09:29:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:33.476 09:29:56 -- common/autotest_common.sh@10 -- # set +x 00:06:33.476 ************************************ 00:06:33.476 START TEST version 00:06:33.476 ************************************ 00:06:33.476 09:29:56 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:06:33.476 * Looking for test storage... 
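The version test that follows derives the tree's version from include/spdk/version.h; the extraction step, sketched with this job's workspace path and the field values visible in this run:

    # get_header_version pattern: grep the macro, keep the second
    # tab-separated field, strip the quotes.
    hdr=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h
    major=$(grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' "$hdr" | cut -f2 | tr -d '"')
    # MAJOR=24 MINOR=1 PATCH=1 SUFFIX=-pre, so the script builds
    # version=24.1.1 and, for the python comparison, 24.1.1rc0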
00:06:33.476 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:33.476 09:29:56 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:33.476 09:29:56 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:33.476 09:29:56 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:33.803 09:29:56 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:33.803 09:29:56 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:33.803 09:29:56 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:33.803 09:29:56 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:33.803 09:29:56 -- scripts/common.sh@335 -- # IFS=.-: 00:06:33.803 09:29:56 -- scripts/common.sh@335 -- # read -ra ver1 00:06:33.803 09:29:56 -- scripts/common.sh@336 -- # IFS=.-: 00:06:33.803 09:29:56 -- scripts/common.sh@336 -- # read -ra ver2 00:06:33.803 09:29:56 -- scripts/common.sh@337 -- # local 'op=<' 00:06:33.803 09:29:56 -- scripts/common.sh@339 -- # ver1_l=2 00:06:33.803 09:29:56 -- scripts/common.sh@340 -- # ver2_l=1 00:06:33.803 09:29:56 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:33.803 09:29:56 -- scripts/common.sh@343 -- # case "$op" in 00:06:33.803 09:29:56 -- scripts/common.sh@344 -- # : 1 00:06:33.803 09:29:56 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:33.803 09:29:56 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:33.803 09:29:56 -- scripts/common.sh@364 -- # decimal 1 00:06:33.803 09:29:56 -- scripts/common.sh@352 -- # local d=1 00:06:33.803 09:29:56 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:33.803 09:29:56 -- scripts/common.sh@354 -- # echo 1 00:06:33.803 09:29:56 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:33.803 09:29:56 -- scripts/common.sh@365 -- # decimal 2 00:06:33.803 09:29:56 -- scripts/common.sh@352 -- # local d=2 00:06:33.803 09:29:56 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:33.803 09:29:56 -- scripts/common.sh@354 -- # echo 2 00:06:33.803 09:29:56 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:33.803 09:29:56 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:33.803 09:29:56 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:33.803 09:29:56 -- scripts/common.sh@367 -- # return 0 00:06:33.803 09:29:56 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:33.803 09:29:56 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:33.803 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:33.803 --rc genhtml_branch_coverage=1 00:06:33.803 --rc genhtml_function_coverage=1 00:06:33.803 --rc genhtml_legend=1 00:06:33.803 --rc geninfo_all_blocks=1 00:06:33.803 --rc geninfo_unexecuted_blocks=1 00:06:33.803 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:33.803 ' 00:06:33.803 09:29:56 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:33.803 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:33.803 --rc genhtml_branch_coverage=1 00:06:33.803 --rc genhtml_function_coverage=1 00:06:33.803 --rc genhtml_legend=1 00:06:33.803 --rc geninfo_all_blocks=1 00:06:33.803 --rc geninfo_unexecuted_blocks=1 00:06:33.803 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:33.803 ' 00:06:33.803 09:29:56 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:33.803 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:33.803 --rc genhtml_branch_coverage=1 00:06:33.803 
--rc genhtml_function_coverage=1 00:06:33.803 --rc genhtml_legend=1 00:06:33.803 --rc geninfo_all_blocks=1 00:06:33.803 --rc geninfo_unexecuted_blocks=1 00:06:33.803 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:33.803 ' 00:06:33.803 09:29:56 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:33.803 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:33.803 --rc genhtml_branch_coverage=1 00:06:33.803 --rc genhtml_function_coverage=1 00:06:33.803 --rc genhtml_legend=1 00:06:33.803 --rc geninfo_all_blocks=1 00:06:33.803 --rc geninfo_unexecuted_blocks=1 00:06:33.803 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:33.803 ' 00:06:33.803 09:29:56 -- app/version.sh@17 -- # get_header_version major 00:06:33.803 09:29:56 -- app/version.sh@14 -- # tr -d '"' 00:06:33.803 09:29:56 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:33.803 09:29:56 -- app/version.sh@14 -- # cut -f2 00:06:33.803 09:29:56 -- app/version.sh@17 -- # major=24 00:06:33.803 09:29:56 -- app/version.sh@18 -- # get_header_version minor 00:06:33.803 09:29:56 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:33.803 09:29:56 -- app/version.sh@14 -- # tr -d '"' 00:06:33.803 09:29:56 -- app/version.sh@14 -- # cut -f2 00:06:33.803 09:29:56 -- app/version.sh@18 -- # minor=1 00:06:33.803 09:29:56 -- app/version.sh@19 -- # get_header_version patch 00:06:33.803 09:29:56 -- app/version.sh@14 -- # tr -d '"' 00:06:33.803 09:29:56 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:33.803 09:29:56 -- app/version.sh@14 -- # cut -f2 00:06:33.803 09:29:56 -- app/version.sh@19 -- # patch=1 00:06:33.803 09:29:56 -- app/version.sh@20 -- # get_header_version suffix 00:06:33.803 09:29:56 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:06:33.803 09:29:56 -- app/version.sh@14 -- # cut -f2 00:06:33.803 09:29:56 -- app/version.sh@14 -- # tr -d '"' 00:06:33.803 09:29:56 -- app/version.sh@20 -- # suffix=-pre 00:06:33.803 09:29:56 -- app/version.sh@22 -- # version=24.1 00:06:33.803 09:29:56 -- app/version.sh@25 -- # (( patch != 0 )) 00:06:33.803 09:29:56 -- app/version.sh@25 -- # version=24.1.1 00:06:33.803 09:29:56 -- app/version.sh@28 -- # version=24.1.1rc0 00:06:33.803 09:29:56 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:33.803 09:29:56 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:06:33.803 09:29:56 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:06:33.803 09:29:56 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:06:33.803 00:06:33.803 real 0m0.265s 00:06:33.803 user 0m0.146s 00:06:33.803 sys 0m0.165s 00:06:33.803 09:29:56 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:33.803 09:29:56 -- common/autotest_common.sh@10 -- # set +x 00:06:33.803 
************************************ 00:06:33.803 END TEST version 00:06:33.803 ************************************ 00:06:33.803 09:29:56 -- spdk/autotest.sh@181 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@191 -- # uname -s 00:06:33.803 09:29:56 -- spdk/autotest.sh@191 -- # [[ Linux == Linux ]] 00:06:33.803 09:29:56 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:06:33.803 09:29:56 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:06:33.803 09:29:56 -- spdk/autotest.sh@204 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@251 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@255 -- # timing_exit lib 00:06:33.803 09:29:56 -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:33.803 09:29:56 -- common/autotest_common.sh@10 -- # set +x 00:06:33.803 09:29:56 -- spdk/autotest.sh@257 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@265 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@298 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@302 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:06:33.803 09:29:56 -- spdk/autotest.sh@353 -- # [[ 0 -eq 1 ]] 00:06:33.803 09:29:56 -- spdk/autotest.sh@357 -- # [[ 0 -eq 1 ]] 00:06:33.803 09:29:56 -- spdk/autotest.sh@361 -- # [[ 1 -eq 1 ]] 00:06:33.803 09:29:56 -- spdk/autotest.sh@362 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:06:33.803 09:29:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:33.803 09:29:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:33.803 09:29:56 -- common/autotest_common.sh@10 -- # set +x 00:06:33.803 ************************************ 00:06:33.803 START TEST llvm_fuzz 00:06:33.803 ************************************ 00:06:33.803 09:29:56 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:06:34.064 * Looking for test storage... 
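Every section of this log is framed by the same run_test wrapper that prints the START TEST/END TEST banners; a simplified sketch of its shape (the real helper in test/common/autotest_common.sh also handles timing, xtrace control and failure accounting):

    run_test() {
      local name=$1; shift
      echo "************************************"
      echo "START TEST $name"
      echo "************************************"
      "$@"; local rc=$?
      echo "************************************"
      echo "END TEST $name"
      echo "************************************"
      return $rc
    }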
00:06:34.064 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:06:34.064 09:29:56 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:34.064 09:29:56 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:34.064 09:29:56 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:34.064 09:29:56 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:34.064 09:29:56 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:34.064 09:29:56 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:34.064 09:29:56 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:34.064 09:29:56 -- scripts/common.sh@335 -- # IFS=.-: 00:06:34.064 09:29:56 -- scripts/common.sh@335 -- # read -ra ver1 00:06:34.064 09:29:56 -- scripts/common.sh@336 -- # IFS=.-: 00:06:34.064 09:29:56 -- scripts/common.sh@336 -- # read -ra ver2 00:06:34.064 09:29:56 -- scripts/common.sh@337 -- # local 'op=<' 00:06:34.064 09:29:56 -- scripts/common.sh@339 -- # ver1_l=2 00:06:34.064 09:29:56 -- scripts/common.sh@340 -- # ver2_l=1 00:06:34.064 09:29:56 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:34.064 09:29:56 -- scripts/common.sh@343 -- # case "$op" in 00:06:34.064 09:29:56 -- scripts/common.sh@344 -- # : 1 00:06:34.064 09:29:56 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:34.064 09:29:56 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:34.064 09:29:56 -- scripts/common.sh@364 -- # decimal 1 00:06:34.064 09:29:56 -- scripts/common.sh@352 -- # local d=1 00:06:34.064 09:29:56 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:34.064 09:29:56 -- scripts/common.sh@354 -- # echo 1 00:06:34.064 09:29:56 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:34.064 09:29:56 -- scripts/common.sh@365 -- # decimal 2 00:06:34.064 09:29:56 -- scripts/common.sh@352 -- # local d=2 00:06:34.064 09:29:56 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:34.064 09:29:56 -- scripts/common.sh@354 -- # echo 2 00:06:34.064 09:29:56 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:34.064 09:29:56 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:34.064 09:29:56 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:34.064 09:29:56 -- scripts/common.sh@367 -- # return 0 00:06:34.064 09:29:56 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:34.064 09:29:56 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:34.064 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.064 --rc genhtml_branch_coverage=1 00:06:34.064 --rc genhtml_function_coverage=1 00:06:34.064 --rc genhtml_legend=1 00:06:34.064 --rc geninfo_all_blocks=1 00:06:34.064 --rc geninfo_unexecuted_blocks=1 00:06:34.064 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.064 ' 00:06:34.064 09:29:56 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:34.064 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.064 --rc genhtml_branch_coverage=1 00:06:34.064 --rc genhtml_function_coverage=1 00:06:34.064 --rc genhtml_legend=1 00:06:34.064 --rc geninfo_all_blocks=1 00:06:34.064 --rc geninfo_unexecuted_blocks=1 00:06:34.064 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.064 ' 00:06:34.064 09:29:56 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:34.064 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.064 --rc genhtml_branch_coverage=1 00:06:34.064 
--rc genhtml_function_coverage=1 00:06:34.064 --rc genhtml_legend=1 00:06:34.064 --rc geninfo_all_blocks=1 00:06:34.064 --rc geninfo_unexecuted_blocks=1 00:06:34.064 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.064 ' 00:06:34.064 09:29:56 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:34.064 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.064 --rc genhtml_branch_coverage=1 00:06:34.064 --rc genhtml_function_coverage=1 00:06:34.064 --rc genhtml_legend=1 00:06:34.064 --rc geninfo_all_blocks=1 00:06:34.064 --rc geninfo_unexecuted_blocks=1 00:06:34.064 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.064 ' 00:06:34.064 09:29:56 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:06:34.064 09:29:56 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:06:34.064 09:29:56 -- common/autotest_common.sh@548 -- # fuzzers=() 00:06:34.064 09:29:56 -- common/autotest_common.sh@548 -- # local fuzzers 00:06:34.064 09:29:56 -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:06:34.064 09:29:56 -- common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:06:34.064 09:29:56 -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:06:34.064 09:29:56 -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:06:34.064 09:29:56 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:06:34.064 09:29:56 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:06:34.064 09:29:56 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:06:34.064 09:29:56 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:06:34.064 09:29:56 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:06:34.064 09:29:56 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:06:34.064 09:29:56 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:06:34.064 09:29:56 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:06:34.064 09:29:56 -- fuzz/llvm.sh@19 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:06:34.064 09:29:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:34.064 09:29:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:34.064 09:29:56 -- common/autotest_common.sh@10 -- # set +x 00:06:34.064 ************************************ 00:06:34.064 START TEST nvmf_fuzz 00:06:34.064 ************************************ 00:06:34.064 09:29:56 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:06:34.064 * Looking for test storage... 
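llvm.sh's fuzzer discovery, traced just above, condensed into a sketch (the dispatch arms for the helper scripts are assumed from the branches not shown here):

    # Glob the llvm fuzz directory, keep basenames, dispatch per target.
    fuzzers=("$rootdir/test/fuzz/llvm/"*)
    fuzzers=("${fuzzers[@]##*/}")   # -> common.sh llvm-gcov.sh nvmf vfio
    for fuzzer in "${fuzzers[@]}"; do
      case "$fuzzer" in
        nvmf|vfio) run_test "${fuzzer}_fuzz" "$rootdir/test/fuzz/llvm/$fuzzer/run.sh" ;;
        *) ;;   # common.sh and llvm-gcov.sh are helpers, not targets
      esac
    done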
00:06:34.064 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:34.064 09:29:56 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:34.064 09:29:56 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:34.064 09:29:56 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:34.326 09:29:56 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:34.326 09:29:56 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:34.326 09:29:56 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:34.326 09:29:56 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:34.326 09:29:56 -- scripts/common.sh@335 -- # IFS=.-: 00:06:34.326 09:29:56 -- scripts/common.sh@335 -- # read -ra ver1 00:06:34.326 09:29:56 -- scripts/common.sh@336 -- # IFS=.-: 00:06:34.326 09:29:56 -- scripts/common.sh@336 -- # read -ra ver2 00:06:34.326 09:29:56 -- scripts/common.sh@337 -- # local 'op=<' 00:06:34.326 09:29:56 -- scripts/common.sh@339 -- # ver1_l=2 00:06:34.326 09:29:56 -- scripts/common.sh@340 -- # ver2_l=1 00:06:34.326 09:29:56 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:34.326 09:29:56 -- scripts/common.sh@343 -- # case "$op" in 00:06:34.326 09:29:56 -- scripts/common.sh@344 -- # : 1 00:06:34.326 09:29:56 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:34.326 09:29:56 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:34.326 09:29:56 -- scripts/common.sh@364 -- # decimal 1 00:06:34.326 09:29:56 -- scripts/common.sh@352 -- # local d=1 00:06:34.326 09:29:56 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:34.326 09:29:56 -- scripts/common.sh@354 -- # echo 1 00:06:34.326 09:29:56 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:34.326 09:29:56 -- scripts/common.sh@365 -- # decimal 2 00:06:34.326 09:29:56 -- scripts/common.sh@352 -- # local d=2 00:06:34.326 09:29:56 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:34.326 09:29:56 -- scripts/common.sh@354 -- # echo 2 00:06:34.326 09:29:56 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:34.326 09:29:56 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:34.326 09:29:56 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:34.326 09:29:56 -- scripts/common.sh@367 -- # return 0 00:06:34.326 09:29:56 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:34.326 09:29:56 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:34.326 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.326 --rc genhtml_branch_coverage=1 00:06:34.326 --rc genhtml_function_coverage=1 00:06:34.326 --rc genhtml_legend=1 00:06:34.326 --rc geninfo_all_blocks=1 00:06:34.326 --rc geninfo_unexecuted_blocks=1 00:06:34.326 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.326 ' 00:06:34.326 09:29:56 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:34.326 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.326 --rc genhtml_branch_coverage=1 00:06:34.326 --rc genhtml_function_coverage=1 00:06:34.326 --rc genhtml_legend=1 00:06:34.326 --rc geninfo_all_blocks=1 00:06:34.326 --rc geninfo_unexecuted_blocks=1 00:06:34.326 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.326 ' 00:06:34.326 09:29:56 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:34.326 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.326 --rc genhtml_branch_coverage=1 
00:06:34.326 --rc genhtml_function_coverage=1 00:06:34.326 --rc genhtml_legend=1 00:06:34.326 --rc geninfo_all_blocks=1 00:06:34.326 --rc geninfo_unexecuted_blocks=1 00:06:34.326 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.326 ' 00:06:34.326 09:29:56 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:34.326 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.326 --rc genhtml_branch_coverage=1 00:06:34.326 --rc genhtml_function_coverage=1 00:06:34.326 --rc genhtml_legend=1 00:06:34.326 --rc geninfo_all_blocks=1 00:06:34.326 --rc geninfo_unexecuted_blocks=1 00:06:34.326 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.326 ' 00:06:34.326 09:29:56 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:06:34.326 09:29:56 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:06:34.326 09:29:56 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:06:34.326 09:29:56 -- common/autotest_common.sh@34 -- # set -e 00:06:34.326 09:29:56 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:06:34.326 09:29:56 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:06:34.326 09:29:56 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:06:34.326 09:29:56 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:06:34.326 09:29:56 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:06:34.326 09:29:56 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:06:34.326 09:29:56 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:06:34.326 09:29:56 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:06:34.326 09:29:56 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:06:34.327 09:29:56 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:06:34.327 09:29:56 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:06:34.327 09:29:56 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:06:34.327 09:29:56 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:06:34.327 09:29:56 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:06:34.327 09:29:56 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:06:34.327 09:29:56 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:06:34.327 09:29:56 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:06:34.327 09:29:56 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:06:34.327 09:29:56 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:06:34.327 09:29:56 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:06:34.327 09:29:56 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:06:34.327 09:29:56 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:06:34.327 09:29:56 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:06:34.327 09:29:56 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:06:34.327 09:29:56 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:06:34.327 09:29:56 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:06:34.327 09:29:56 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:06:34.327 09:29:56 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:06:34.327 09:29:56 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:06:34.327 
09:29:56 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:06:34.327 09:29:56 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:06:34.327 09:29:56 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:06:34.327 09:29:56 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:06:34.327 09:29:56 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:06:34.327 09:29:56 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:06:34.327 09:29:56 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:06:34.327 09:29:56 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:06:34.327 09:29:56 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:06:34.327 09:29:56 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:06:34.327 09:29:56 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:06:34.327 09:29:56 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:06:34.327 09:29:56 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:06:34.327 09:29:56 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:06:34.327 09:29:56 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:06:34.327 09:29:56 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:06:34.327 09:29:56 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:06:34.327 09:29:56 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:06:34.327 09:29:56 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:06:34.327 09:29:56 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:06:34.327 09:29:56 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:06:34.327 09:29:56 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:06:34.327 09:29:56 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:06:34.327 09:29:56 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:06:34.327 09:29:56 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:06:34.327 09:29:56 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:06:34.327 09:29:56 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:06:34.327 09:29:56 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:06:34.327 09:29:56 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:06:34.327 09:29:56 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:06:34.327 09:29:56 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:06:34.327 09:29:56 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:06:34.327 09:29:56 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:06:34.327 09:29:56 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:06:34.327 09:29:56 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:06:34.327 09:29:56 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:06:34.327 09:29:56 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:06:34.327 09:29:56 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:06:34.327 09:29:56 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:06:34.327 09:29:56 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:06:34.327 09:29:56 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:06:34.327 09:29:56 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:06:34.327 09:29:56 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:06:34.327 09:29:56 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:06:34.327 09:29:56 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:06:34.327 09:29:56 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 
00:06:34.327 09:29:56 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:06:34.327 09:29:56 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:06:34.327 09:29:56 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:06:34.327 09:29:56 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:06:34.327 09:29:56 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:06:34.327 09:29:56 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:06:34.327 09:29:56 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:06:34.327 09:29:56 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:06:34.327 09:29:56 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:06:34.327 09:29:56 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:06:34.327 09:29:56 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:06:34.327 09:29:56 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:06:34.327 09:29:56 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:06:34.327 09:29:56 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:34.327 09:29:56 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:06:34.327 09:29:56 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:34.327 09:29:56 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:06:34.327 09:29:56 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:06:34.327 09:29:56 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:06:34.327 09:29:56 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:06:34.327 09:29:56 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:06:34.327 09:29:56 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:06:34.327 09:29:56 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:06:34.327 09:29:56 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:06:34.327 #define SPDK_CONFIG_H 00:06:34.327 #define SPDK_CONFIG_APPS 1 00:06:34.327 #define SPDK_CONFIG_ARCH native 00:06:34.327 #undef SPDK_CONFIG_ASAN 00:06:34.327 #undef SPDK_CONFIG_AVAHI 00:06:34.327 #undef SPDK_CONFIG_CET 00:06:34.327 #define SPDK_CONFIG_COVERAGE 1 00:06:34.327 #define SPDK_CONFIG_CROSS_PREFIX 00:06:34.327 #undef SPDK_CONFIG_CRYPTO 00:06:34.327 #undef SPDK_CONFIG_CRYPTO_MLX5 00:06:34.327 #undef SPDK_CONFIG_CUSTOMOCF 00:06:34.327 #undef SPDK_CONFIG_DAOS 00:06:34.327 #define SPDK_CONFIG_DAOS_DIR 00:06:34.327 #define SPDK_CONFIG_DEBUG 1 00:06:34.327 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:06:34.327 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:06:34.327 #define SPDK_CONFIG_DPDK_INC_DIR 00:06:34.327 #define SPDK_CONFIG_DPDK_LIB_DIR 00:06:34.327 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:06:34.327 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:06:34.327 #define SPDK_CONFIG_EXAMPLES 1 00:06:34.327 #undef SPDK_CONFIG_FC 00:06:34.327 #define SPDK_CONFIG_FC_PATH 00:06:34.327 #define SPDK_CONFIG_FIO_PLUGIN 1 
00:06:34.327 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:06:34.327 #undef SPDK_CONFIG_FUSE 00:06:34.327 #define SPDK_CONFIG_FUZZER 1 00:06:34.327 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:06:34.327 #undef SPDK_CONFIG_GOLANG 00:06:34.327 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:06:34.327 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:06:34.327 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:06:34.327 #undef SPDK_CONFIG_HAVE_LIBBSD 00:06:34.327 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:06:34.327 #define SPDK_CONFIG_IDXD 1 00:06:34.327 #define SPDK_CONFIG_IDXD_KERNEL 1 00:06:34.327 #undef SPDK_CONFIG_IPSEC_MB 00:06:34.327 #define SPDK_CONFIG_IPSEC_MB_DIR 00:06:34.327 #define SPDK_CONFIG_ISAL 1 00:06:34.327 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:06:34.327 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:06:34.327 #define SPDK_CONFIG_LIBDIR 00:06:34.327 #undef SPDK_CONFIG_LTO 00:06:34.327 #define SPDK_CONFIG_MAX_LCORES 00:06:34.327 #define SPDK_CONFIG_NVME_CUSE 1 00:06:34.327 #undef SPDK_CONFIG_OCF 00:06:34.327 #define SPDK_CONFIG_OCF_PATH 00:06:34.327 #define SPDK_CONFIG_OPENSSL_PATH 00:06:34.327 #undef SPDK_CONFIG_PGO_CAPTURE 00:06:34.327 #undef SPDK_CONFIG_PGO_USE 00:06:34.327 #define SPDK_CONFIG_PREFIX /usr/local 00:06:34.327 #undef SPDK_CONFIG_RAID5F 00:06:34.327 #undef SPDK_CONFIG_RBD 00:06:34.327 #define SPDK_CONFIG_RDMA 1 00:06:34.327 #define SPDK_CONFIG_RDMA_PROV verbs 00:06:34.327 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:06:34.327 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:06:34.327 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:06:34.327 #undef SPDK_CONFIG_SHARED 00:06:34.327 #undef SPDK_CONFIG_SMA 00:06:34.327 #define SPDK_CONFIG_TESTS 1 00:06:34.327 #undef SPDK_CONFIG_TSAN 00:06:34.327 #define SPDK_CONFIG_UBLK 1 00:06:34.327 #define SPDK_CONFIG_UBSAN 1 00:06:34.327 #undef SPDK_CONFIG_UNIT_TESTS 00:06:34.327 #undef SPDK_CONFIG_URING 00:06:34.327 #define SPDK_CONFIG_URING_PATH 00:06:34.327 #undef SPDK_CONFIG_URING_ZNS 00:06:34.327 #undef SPDK_CONFIG_USDT 00:06:34.327 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:06:34.327 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:06:34.327 #define SPDK_CONFIG_VFIO_USER 1 00:06:34.327 #define SPDK_CONFIG_VFIO_USER_DIR 00:06:34.327 #define SPDK_CONFIG_VHOST 1 00:06:34.327 #define SPDK_CONFIG_VIRTIO 1 00:06:34.327 #undef SPDK_CONFIG_VTUNE 00:06:34.327 #define SPDK_CONFIG_VTUNE_DIR 00:06:34.327 #define SPDK_CONFIG_WERROR 1 00:06:34.328 #define SPDK_CONFIG_WPDK_DIR 00:06:34.328 #undef SPDK_CONFIG_XNVME 00:06:34.328 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:06:34.328 09:29:56 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:06:34.328 09:29:56 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:34.328 09:29:56 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:34.328 09:29:56 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:34.328 09:29:56 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:34.328 09:29:56 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:34.328 09:29:56 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:34.328 09:29:56 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:34.328 09:29:56 -- paths/export.sh@5 -- # export PATH 00:06:34.328 09:29:56 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:34.328 09:29:56 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:06:34.328 09:29:56 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:06:34.328 09:29:56 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:06:34.328 09:29:56 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:06:34.328 09:29:56 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:06:34.328 09:29:56 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:06:34.328 09:29:56 -- pm/common@16 -- # TEST_TAG=N/A 00:06:34.328 09:29:56 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:06:34.328 09:29:56 -- common/autotest_common.sh@52 -- # : 1 00:06:34.328 09:29:56 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:06:34.328 09:29:56 -- common/autotest_common.sh@56 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:06:34.328 09:29:56 -- common/autotest_common.sh@58 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:06:34.328 09:29:56 -- common/autotest_common.sh@60 -- # : 1 00:06:34.328 09:29:56 -- common/autotest_common.sh@61 -- # export 
SPDK_RUN_FUNCTIONAL_TEST 00:06:34.328 09:29:56 -- common/autotest_common.sh@62 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:06:34.328 09:29:56 -- common/autotest_common.sh@64 -- # : 00:06:34.328 09:29:56 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:06:34.328 09:29:56 -- common/autotest_common.sh@66 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:06:34.328 09:29:56 -- common/autotest_common.sh@68 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:06:34.328 09:29:56 -- common/autotest_common.sh@70 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:06:34.328 09:29:56 -- common/autotest_common.sh@72 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:06:34.328 09:29:56 -- common/autotest_common.sh@74 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:06:34.328 09:29:56 -- common/autotest_common.sh@76 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:06:34.328 09:29:56 -- common/autotest_common.sh@78 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:06:34.328 09:29:56 -- common/autotest_common.sh@80 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:06:34.328 09:29:56 -- common/autotest_common.sh@82 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:06:34.328 09:29:56 -- common/autotest_common.sh@84 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:06:34.328 09:29:56 -- common/autotest_common.sh@86 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:06:34.328 09:29:56 -- common/autotest_common.sh@88 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:06:34.328 09:29:56 -- common/autotest_common.sh@90 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:06:34.328 09:29:56 -- common/autotest_common.sh@92 -- # : 1 00:06:34.328 09:29:56 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:06:34.328 09:29:56 -- common/autotest_common.sh@94 -- # : 1 00:06:34.328 09:29:56 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:06:34.328 09:29:56 -- common/autotest_common.sh@96 -- # : rdma 00:06:34.328 09:29:56 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:06:34.328 09:29:56 -- common/autotest_common.sh@98 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:06:34.328 09:29:56 -- common/autotest_common.sh@100 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:06:34.328 09:29:56 -- common/autotest_common.sh@102 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:06:34.328 09:29:56 -- common/autotest_common.sh@104 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:06:34.328 09:29:56 -- common/autotest_common.sh@106 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:06:34.328 09:29:56 -- common/autotest_common.sh@108 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@109 
-- # export SPDK_TEST_VHOST_INIT 00:06:34.328 09:29:56 -- common/autotest_common.sh@110 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:06:34.328 09:29:56 -- common/autotest_common.sh@112 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:06:34.328 09:29:56 -- common/autotest_common.sh@114 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:06:34.328 09:29:56 -- common/autotest_common.sh@116 -- # : 1 00:06:34.328 09:29:56 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:06:34.328 09:29:56 -- common/autotest_common.sh@118 -- # : 00:06:34.328 09:29:56 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:06:34.328 09:29:56 -- common/autotest_common.sh@120 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:06:34.328 09:29:56 -- common/autotest_common.sh@122 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:06:34.328 09:29:56 -- common/autotest_common.sh@124 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:06:34.328 09:29:56 -- common/autotest_common.sh@126 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:06:34.328 09:29:56 -- common/autotest_common.sh@128 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:06:34.328 09:29:56 -- common/autotest_common.sh@130 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:06:34.328 09:29:56 -- common/autotest_common.sh@132 -- # : 00:06:34.328 09:29:56 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:06:34.328 09:29:56 -- common/autotest_common.sh@134 -- # : true 00:06:34.328 09:29:56 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:06:34.328 09:29:56 -- common/autotest_common.sh@136 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:06:34.328 09:29:56 -- common/autotest_common.sh@138 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:06:34.328 09:29:56 -- common/autotest_common.sh@140 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:06:34.328 09:29:56 -- common/autotest_common.sh@142 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:06:34.328 09:29:56 -- common/autotest_common.sh@144 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:06:34.328 09:29:56 -- common/autotest_common.sh@146 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:06:34.328 09:29:56 -- common/autotest_common.sh@148 -- # : 00:06:34.328 09:29:56 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:06:34.328 09:29:56 -- common/autotest_common.sh@150 -- # : 0 00:06:34.328 09:29:56 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:06:34.328 09:29:56 -- common/autotest_common.sh@152 -- # : 0 00:06:34.328 09:29:57 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:06:34.328 09:29:57 -- common/autotest_common.sh@154 -- # : 0 00:06:34.328 09:29:57 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:06:34.328 09:29:57 -- common/autotest_common.sh@156 -- # : 0 00:06:34.328 09:29:57 -- 
common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:06:34.328 09:29:57 -- common/autotest_common.sh@158 -- # : 0 00:06:34.328 09:29:57 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:06:34.328 09:29:57 -- common/autotest_common.sh@160 -- # : 0 00:06:34.328 09:29:57 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:06:34.328 09:29:57 -- common/autotest_common.sh@163 -- # : 00:06:34.328 09:29:57 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:06:34.328 09:29:57 -- common/autotest_common.sh@165 -- # : 0 00:06:34.328 09:29:57 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:06:34.328 09:29:57 -- common/autotest_common.sh@167 -- # : 0 00:06:34.328 09:29:57 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:06:34.329 09:29:57 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:06:34.329 09:29:57 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:06:34.329 09:29:57 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:06:34.329 09:29:57 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:06:34.329 09:29:57 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:34.329 09:29:57 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:34.329 09:29:57 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:34.329 09:29:57 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:06:34.329 09:29:57 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:06:34.329 09:29:57 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:06:34.329 09:29:57 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:34.329 09:29:57 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:06:34.329 09:29:57 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:06:34.329 09:29:57 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:06:34.329 09:29:57 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:06:34.329 09:29:57 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:06:34.329 09:29:57 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:06:34.329 09:29:57 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:06:34.329 09:29:57 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:06:34.329 09:29:57 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:06:34.329 09:29:57 -- common/autotest_common.sh@196 -- # cat 00:06:34.329 09:29:57 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:06:34.329 09:29:57 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:06:34.329 09:29:57 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:06:34.329 09:29:57 -- 
common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:06:34.329 09:29:57 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:06:34.329 09:29:57 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:06:34.329 09:29:57 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:06:34.329 09:29:57 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:34.329 09:29:57 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:06:34.329 09:29:57 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:34.329 09:29:57 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:06:34.329 09:29:57 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:06:34.329 09:29:57 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:06:34.329 09:29:57 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:06:34.329 09:29:57 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:06:34.329 09:29:57 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:06:34.329 09:29:57 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:06:34.329 09:29:57 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:06:34.329 09:29:57 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:06:34.329 09:29:57 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:06:34.329 09:29:57 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:06:34.329 09:29:57 -- common/autotest_common.sh@249 -- # _LCOV= 00:06:34.329 09:29:57 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:06:34.329 09:29:57 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:06:34.329 09:29:57 -- common/autotest_common.sh@250 -- # _LCOV=1 00:06:34.329 09:29:57 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:06:34.329 09:29:57 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:06:34.329 09:29:57 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:06:34.329 09:29:57 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:06:34.329 09:29:57 -- common/autotest_common.sh@259 -- # export valgrind= 00:06:34.329 09:29:57 -- common/autotest_common.sh@259 -- # valgrind= 00:06:34.329 09:29:57 -- common/autotest_common.sh@265 -- # uname -s 00:06:34.329 09:29:57 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:06:34.329 09:29:57 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:06:34.329 09:29:57 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:06:34.329 09:29:57 -- common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:06:34.329 09:29:57 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:06:34.329 09:29:57 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:06:34.329 09:29:57 
-- common/autotest_common.sh@275 -- # MAKE=make 00:06:34.329 09:29:57 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:06:34.329 09:29:57 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:06:34.329 09:29:57 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:06:34.329 09:29:57 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:06:34.329 09:29:57 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:06:34.329 09:29:57 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:06:34.329 09:29:57 -- common/autotest_common.sh@319 -- # [[ -z 3177985 ]] 00:06:34.329 09:29:57 -- common/autotest_common.sh@319 -- # kill -0 3177985 00:06:34.329 09:29:57 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:06:34.329 09:29:57 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:06:34.329 09:29:57 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:06:34.329 09:29:57 -- common/autotest_common.sh@332 -- # local mount target_dir 00:06:34.329 09:29:57 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:06:34.329 09:29:57 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:06:34.329 09:29:57 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:06:34.329 09:29:57 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:06:34.329 09:29:57 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.SgcofJ 00:06:34.329 09:29:57 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:06:34.329 09:29:57 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:06:34.329 09:29:57 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:06:34.329 09:29:57 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.SgcofJ/tests/nvmf /tmp/spdk.SgcofJ 00:06:34.329 09:29:57 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:06:34.329 09:29:57 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:34.329 09:29:57 -- common/autotest_common.sh@328 -- # df -T 00:06:34.329 09:29:57 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:06:34.329 09:29:57 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:06:34.329 09:29:57 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:06:34.329 09:29:57 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:06:34.329 09:29:57 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:06:34.329 09:29:57 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:06:34.329 09:29:57 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:34.329 09:29:57 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:06:34.329 09:29:57 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:06:34.329 09:29:57 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:06:34.329 09:29:57 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:06:34.329 09:29:57 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:06:34.329 09:29:57 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:34.329 09:29:57 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:06:34.329 09:29:57 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 
00:06:34.329 09:29:57 -- common/autotest_common.sh@363 -- # avails["$mount"]=53293043712 00:06:34.329 09:29:57 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730607104 00:06:34.329 09:29:57 -- common/autotest_common.sh@364 -- # uses["$mount"]=8437563392 00:06:34.330 09:29:57 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:34.330 09:29:57 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:06:34.330 09:29:57 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:06:34.330 09:29:57 -- common/autotest_common.sh@363 -- # avails["$mount"]=30862708736 00:06:34.330 09:29:57 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865301504 00:06:34.330 09:29:57 -- common/autotest_common.sh@364 -- # uses["$mount"]=2592768 00:06:34.330 09:29:57 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:34.330 09:29:57 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:06:34.330 09:29:57 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:06:34.330 09:29:57 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340129792 00:06:34.330 09:29:57 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346122240 00:06:34.330 09:29:57 -- common/autotest_common.sh@364 -- # uses["$mount"]=5992448 00:06:34.330 09:29:57 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:34.330 09:29:57 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:06:34.330 09:29:57 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:06:34.330 09:29:57 -- common/autotest_common.sh@363 -- # avails["$mount"]=30863421440 00:06:34.330 09:29:57 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865305600 00:06:34.330 09:29:57 -- common/autotest_common.sh@364 -- # uses["$mount"]=1884160 00:06:34.330 09:29:57 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:34.330 09:29:57 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:06:34.330 09:29:57 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:06:34.330 09:29:57 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:06:34.330 09:29:57 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:06:34.330 09:29:57 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:06:34.330 09:29:57 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:06:34.330 09:29:57 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:06:34.330 * Looking for test storage... 
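The set_test_storage trace above and below boils down to a simple sizing check: pick the first candidate directory whose filesystem can hold the requested test data, then verify the write would not push the mount past 95% full. A minimal bash sketch of that arithmetic, using the values visible in this run (variable names follow the trace; this is a paraphrase for readability, not the actual autotest_common.sh source):

    requested_size=2214592512      # 2 GiB request plus overhead, per the trace
    used=8437563392                # bytes already used on / (spdk_root overlay)
    avail=53293043712              # target_space: bytes available on /
    fs_size=61730607104            # total size of /
    if (( avail >= requested_size )); then
        new_size=$(( used + requested_size ))   # 10652155904 in this run
        if (( new_size * 100 / fs_size > 95 )); then
            echo "test storage would leave / more than 95% full" >&2
        fi
    fi

Here 10652155904 * 100 / 61730607104 is roughly 17, well under the 95% threshold, so the run accepts / and reports the test storage as found.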
00:06:34.330 09:29:57 -- common/autotest_common.sh@369 -- # local target_space new_size 00:06:34.330 09:29:57 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:06:34.330 09:29:57 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:34.330 09:29:57 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:06:34.330 09:29:57 -- common/autotest_common.sh@373 -- # mount=/ 00:06:34.330 09:29:57 -- common/autotest_common.sh@375 -- # target_space=53293043712 00:06:34.330 09:29:57 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:06:34.330 09:29:57 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:06:34.330 09:29:57 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:06:34.330 09:29:57 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:06:34.330 09:29:57 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:06:34.330 09:29:57 -- common/autotest_common.sh@382 -- # new_size=10652155904 00:06:34.330 09:29:57 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:06:34.330 09:29:57 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:34.330 09:29:57 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:34.330 09:29:57 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:34.330 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:06:34.330 09:29:57 -- common/autotest_common.sh@390 -- # return 0 00:06:34.330 09:29:57 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:06:34.330 09:29:57 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:06:34.330 09:29:57 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:06:34.330 09:29:57 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:06:34.330 09:29:57 -- common/autotest_common.sh@1682 -- # true 00:06:34.330 09:29:57 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:06:34.330 09:29:57 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:06:34.330 09:29:57 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:06:34.330 09:29:57 -- common/autotest_common.sh@27 -- # exec 00:06:34.330 09:29:57 -- common/autotest_common.sh@29 -- # exec 00:06:34.330 09:29:57 -- common/autotest_common.sh@31 -- # xtrace_restore 00:06:34.330 09:29:57 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:06:34.330 09:29:57 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:06:34.330 09:29:57 -- common/autotest_common.sh@18 -- # set -x 00:06:34.330 09:29:57 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:34.330 09:29:57 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:34.330 09:29:57 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:34.330 09:29:57 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:34.330 09:29:57 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:34.330 09:29:57 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:34.330 09:29:57 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:34.330 09:29:57 -- scripts/common.sh@335 -- # IFS=.-: 00:06:34.330 09:29:57 -- scripts/common.sh@335 -- # read -ra ver1 00:06:34.330 09:29:57 -- scripts/common.sh@336 -- # IFS=.-: 00:06:34.330 09:29:57 -- scripts/common.sh@336 -- # read -ra ver2 00:06:34.330 09:29:57 -- scripts/common.sh@337 -- # local 'op=<' 00:06:34.330 09:29:57 -- scripts/common.sh@339 -- # ver1_l=2 00:06:34.330 09:29:57 -- scripts/common.sh@340 -- # ver2_l=1 00:06:34.330 09:29:57 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:34.330 09:29:57 -- scripts/common.sh@343 -- # case "$op" in 00:06:34.330 09:29:57 -- scripts/common.sh@344 -- # : 1 00:06:34.330 09:29:57 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:34.330 09:29:57 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:34.330 09:29:57 -- scripts/common.sh@364 -- # decimal 1 00:06:34.330 09:29:57 -- scripts/common.sh@352 -- # local d=1 00:06:34.330 09:29:57 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:34.330 09:29:57 -- scripts/common.sh@354 -- # echo 1 00:06:34.330 09:29:57 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:34.330 09:29:57 -- scripts/common.sh@365 -- # decimal 2 00:06:34.590 09:29:57 -- scripts/common.sh@352 -- # local d=2 00:06:34.590 09:29:57 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:34.590 09:29:57 -- scripts/common.sh@354 -- # echo 2 00:06:34.590 09:29:57 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:34.590 09:29:57 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:34.590 09:29:57 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:34.590 09:29:57 -- scripts/common.sh@367 -- # return 0 00:06:34.590 09:29:57 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:34.590 09:29:57 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:34.590 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.590 --rc genhtml_branch_coverage=1 00:06:34.590 --rc genhtml_function_coverage=1 00:06:34.590 --rc genhtml_legend=1 00:06:34.590 --rc geninfo_all_blocks=1 00:06:34.590 --rc geninfo_unexecuted_blocks=1 00:06:34.590 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.590 ' 00:06:34.590 09:29:57 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:34.590 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.590 --rc genhtml_branch_coverage=1 00:06:34.590 --rc genhtml_function_coverage=1 00:06:34.590 --rc genhtml_legend=1 00:06:34.590 --rc geninfo_all_blocks=1 00:06:34.590 --rc geninfo_unexecuted_blocks=1 00:06:34.590 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.590 ' 00:06:34.590 09:29:57 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:34.590 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:06:34.590 --rc genhtml_branch_coverage=1 00:06:34.590 --rc genhtml_function_coverage=1 00:06:34.590 --rc genhtml_legend=1 00:06:34.590 --rc geninfo_all_blocks=1 00:06:34.590 --rc geninfo_unexecuted_blocks=1 00:06:34.590 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.590 ' 00:06:34.590 09:29:57 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:34.590 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:34.590 --rc genhtml_branch_coverage=1 00:06:34.590 --rc genhtml_function_coverage=1 00:06:34.590 --rc genhtml_legend=1 00:06:34.590 --rc geninfo_all_blocks=1 00:06:34.590 --rc geninfo_unexecuted_blocks=1 00:06:34.590 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:34.590 ' 00:06:34.590 09:29:57 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:06:34.590 09:29:57 -- ../common.sh@8 -- # pids=() 00:06:34.590 09:29:57 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:06:34.590 09:29:57 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:06:34.590 09:29:57 -- nvmf/run.sh@56 -- # fuzz_num=25 00:06:34.590 09:29:57 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 )) 00:06:34.590 09:29:57 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT 00:06:34.590 09:29:57 -- nvmf/run.sh@61 -- # mem_size=512 00:06:34.590 09:29:57 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]] 00:06:34.590 09:29:57 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1 00:06:34.590 09:29:57 -- ../common.sh@69 -- # local fuzz_num=25 00:06:34.590 09:29:57 -- ../common.sh@70 -- # local time=1 00:06:34.590 09:29:57 -- ../common.sh@72 -- # (( i = 0 )) 00:06:34.590 09:29:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:34.590 09:29:57 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:06:34.590 09:29:57 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:06:34.590 09:29:57 -- nvmf/run.sh@24 -- # local timen=1 00:06:34.590 09:29:57 -- nvmf/run.sh@25 -- # local core=0x1 00:06:34.590 09:29:57 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:34.590 09:29:57 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:06:34.590 09:29:57 -- nvmf/run.sh@29 -- # printf %02d 0 00:06:34.590 09:29:57 -- nvmf/run.sh@29 -- # port=4400 00:06:34.590 09:29:57 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:34.590 09:29:57 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:06:34.590 09:29:57 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:34.590 09:29:57 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock 00:06:34.590 [2024-11-29 09:29:57.228935] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 
initialization... 00:06:34.590 [2024-11-29 09:29:57.229031] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3178042 ] 00:06:34.590 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.850 [2024-11-29 09:29:57.496123] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.850 [2024-11-29 09:29:57.583213] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:34.850 [2024-11-29 09:29:57.583344] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.850 [2024-11-29 09:29:57.641212] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:34.850 [2024-11-29 09:29:57.657590] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:06:34.850 INFO: Running with entropic power schedule (0xFF, 100). 00:06:34.850 INFO: Seed: 415498307 00:06:34.850 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:34.850 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:34.850 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:06:34.850 INFO: A corpus is not provided, starting from an empty corpus 00:06:34.850 #2 INITED exec/s: 0 rss: 60Mb 00:06:34.850 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:34.850 This may also happen if the target rejected all inputs we tried so far 00:06:35.109 [2024-11-29 09:29:57.712733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:35.109 [2024-11-29 09:29:57.712762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.368 NEW_FUNC[1/669]: 0x43a858 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:06:35.368 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:35.368 #21 NEW cov: 11526 ft: 11527 corp: 2/91b lim: 320 exec/s: 0 rss: 68Mb L: 90/90 MS: 4 ChangeByte-CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:06:35.368 [2024-11-29 09:29:58.013485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:35.368 [2024-11-29 09:29:58.013517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.368 #22 NEW cov: 11639 ft: 12053 corp: 3/181b lim: 320 exec/s: 0 rss: 68Mb L: 90/90 MS: 1 CopyPart- 00:06:35.368 [2024-11-29 09:29:58.053555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:35.368 [2024-11-29 09:29:58.053582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.368 #28 NEW cov: 11645 ft: 12291 corp: 4/271b lim: 320 exec/s: 0 rss: 68Mb L: 90/90 MS: 1 ChangeByte- 00:06:35.368 [2024-11-29 09:29:58.093654] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 
00:06:35.368 [2024-11-29 09:29:58.093683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.368 NEW_FUNC[1/1]: 0x16c34e8 in nvme_get_sgl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:159 00:06:35.368 #34 NEW cov: 11768 ft: 12626 corp: 5/378b lim: 320 exec/s: 0 rss: 68Mb L: 107/107 MS: 1 InsertRepeatedBytes- 00:06:35.368 [2024-11-29 09:29:58.133757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000040 cdw11:00000000 00:06:35.368 [2024-11-29 09:29:58.133782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.368 #35 NEW cov: 11768 ft: 12698 corp: 6/468b lim: 320 exec/s: 0 rss: 68Mb L: 90/107 MS: 1 ChangeBit- 00:06:35.368 [2024-11-29 09:29:58.173866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:35.368 [2024-11-29 09:29:58.173891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.368 #36 NEW cov: 11768 ft: 12775 corp: 7/555b lim: 320 exec/s: 0 rss: 68Mb L: 87/107 MS: 1 EraseBytes- 00:06:35.627 [2024-11-29 09:29:58.213965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:33333333 cdw11:33333333 00:06:35.627 [2024-11-29 09:29:58.213991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.627 #39 NEW cov: 11768 ft: 12866 corp: 8/676b lim: 320 exec/s: 0 rss: 68Mb L: 121/121 MS: 3 EraseBytes-EraseBytes-InsertRepeatedBytes- 00:06:35.627 [2024-11-29 09:29:58.254125] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:35.627 [2024-11-29 09:29:58.254151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.627 #40 NEW cov: 11768 ft: 13007 corp: 9/784b lim: 320 exec/s: 0 rss: 68Mb L: 108/121 MS: 1 InsertByte- 00:06:35.627 [2024-11-29 09:29:58.294438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:33333333 cdw11:33333333 00:06:35.627 [2024-11-29 09:29:58.294463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.627 [2024-11-29 09:29:58.294526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (33) qid:0 cid:5 nsid:33333333 cdw10:afafafaf cdw11:afafafaf SGL TRANSPORT DATA BLOCK TRANSPORT 0xafafafafafafafaf 00:06:35.627 [2024-11-29 09:29:58.294540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:35.627 [2024-11-29 09:29:58.294616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (af) qid:0 cid:6 nsid:afafafaf cdw10:33333333 cdw11:33333333 SGL TRANSPORT DATA BLOCK TRANSPORT 0x3333333333333333 00:06:35.627 [2024-11-29 09:29:58.294631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:35.627 NEW_FUNC[1/1]: 0x12c5228 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2016 00:06:35.627 #41 NEW cov: 11799 ft: 
13380 corp: 10/984b lim: 320 exec/s: 0 rss: 69Mb L: 200/200 MS: 1 InsertRepeatedBytes- 00:06:35.627 [2024-11-29 09:29:58.344363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:35.627 [2024-11-29 09:29:58.344387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.627 #42 NEW cov: 11799 ft: 13451 corp: 11/1074b lim: 320 exec/s: 0 rss: 69Mb L: 90/200 MS: 1 ChangeBinInt- 00:06:35.627 [2024-11-29 09:29:58.384648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:33333333 cdw11:33333333 00:06:35.627 [2024-11-29 09:29:58.384673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.627 [2024-11-29 09:29:58.384732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (33) qid:0 cid:5 nsid:33333333 cdw10:afafafaf cdw11:afafafaf SGL TRANSPORT DATA BLOCK TRANSPORT 0xafafafafafafafaf 00:06:35.627 [2024-11-29 09:29:58.384746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:35.627 [2024-11-29 09:29:58.384802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (af) qid:0 cid:6 nsid:afafafaf cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x3030303030303030 00:06:35.627 [2024-11-29 09:29:58.384815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:35.627 #43 NEW cov: 11799 ft: 13465 corp: 12/1274b lim: 320 exec/s: 0 rss: 69Mb L: 200/200 MS: 1 ChangeASCIIInt- 00:06:35.627 [2024-11-29 09:29:58.424603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:35.627 [2024-11-29 09:29:58.424628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.627 #44 NEW cov: 11799 ft: 13494 corp: 13/1362b lim: 320 exec/s: 0 rss: 69Mb L: 88/200 MS: 1 EraseBytes- 00:06:35.627 [2024-11-29 09:29:58.464870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:35.627 [2024-11-29 09:29:58.464895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.627 [2024-11-29 09:29:58.464945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:35.627 [2024-11-29 09:29:58.464959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:35.886 #47 NEW cov: 11800 ft: 13665 corp: 14/1511b lim: 320 exec/s: 0 rss: 69Mb L: 149/200 MS: 3 EraseBytes-ChangeBinInt-CrossOver- 00:06:35.886 [2024-11-29 09:29:58.504833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:35.886 [2024-11-29 09:29:58.504858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.886 #48 NEW cov: 11800 ft: 13707 corp: 15/1584b lim: 320 exec/s: 0 rss: 69Mb L: 73/200 MS: 1 CrossOver- 00:06:35.886 [2024-11-29 09:29:58.535126] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:33333333 cdw11:33333333 00:06:35.886 [2024-11-29 09:29:58.535152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.886 [2024-11-29 09:29:58.535209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (33) qid:0 cid:5 nsid:33333333 cdw10:afafafaf cdw11:afafafaf SGL TRANSPORT DATA BLOCK TRANSPORT 0xafafafafafafafaf 00:06:35.886 [2024-11-29 09:29:58.535223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:35.886 [2024-11-29 09:29:58.535279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (af) qid:0 cid:6 nsid:afafafaf cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x3030303030303030 00:06:35.886 [2024-11-29 09:29:58.535293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:35.886 #49 NEW cov: 11800 ft: 13739 corp: 16/1784b lim: 320 exec/s: 0 rss: 69Mb L: 200/200 MS: 1 ChangeByte- 00:06:35.886 [2024-11-29 09:29:58.575028] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:35.886 [2024-11-29 09:29:58.575057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.886 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:35.886 #50 NEW cov: 11823 ft: 13783 corp: 17/1892b lim: 320 exec/s: 0 rss: 69Mb L: 108/200 MS: 1 ShuffleBytes- 00:06:35.886 [2024-11-29 09:29:58.615143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:35.886 [2024-11-29 09:29:58.615167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.886 #51 NEW cov: 11823 ft: 13802 corp: 18/1965b lim: 320 exec/s: 0 rss: 69Mb L: 73/200 MS: 1 ChangeBinInt- 00:06:35.886 [2024-11-29 09:29:58.655243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:93010000 00:06:35.886 [2024-11-29 09:29:58.655268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.886 #52 NEW cov: 11823 ft: 13887 corp: 19/2055b lim: 320 exec/s: 0 rss: 69Mb L: 90/200 MS: 1 CMP- DE: "\001\223\317\301\304\017\351f"- 00:06:35.886 [2024-11-29 09:29:58.695472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:35.886 [2024-11-29 09:29:58.695497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:35.886 [2024-11-29 09:29:58.695549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:5a5a5a5a cdw11:5a5a5a5a 00:06:35.886 [2024-11-29 09:29:58.695562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:35.886 #53 NEW cov: 11823 ft: 13946 corp: 20/2201b lim: 320 exec/s: 53 rss: 69Mb L: 146/200 MS: 1 InsertRepeatedBytes- 00:06:36.145 [2024-11-29 09:29:58.735634] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:33333333 cdw11:33333333 00:06:36.145 [2024-11-29 09:29:58.735660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.145 [2024-11-29 09:29:58.735744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (33) qid:0 cid:5 nsid:33333333 cdw10:afafafaf cdw11:afafafaf SGL TRANSPORT DATA BLOCK TRANSPORT 0xafafafafafafafaf 00:06:36.145 [2024-11-29 09:29:58.735758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.145 [2024-11-29 09:29:58.735817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (af) qid:0 cid:6 nsid:afafafaf cdw10:30303037 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0x3030303030303030 00:06:36.145 [2024-11-29 09:29:58.735830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:36.145 #54 NEW cov: 11823 ft: 13954 corp: 21/2401b lim: 320 exec/s: 54 rss: 69Mb L: 200/200 MS: 1 ChangeBinInt- 00:06:36.145 [2024-11-29 09:29:58.775765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:33333333 cdw11:33333333 00:06:36.145 [2024-11-29 09:29:58.775791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.145 [2024-11-29 09:29:58.775851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (33) qid:0 cid:5 nsid:33333333 cdw10:afafafaf cdw11:afafafaf SGL TRANSPORT DATA BLOCK TRANSPORT 0xafafafafafafafaf 00:06:36.145 [2024-11-29 09:29:58.775864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.145 [2024-11-29 09:29:58.775923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (af) qid:0 cid:6 nsid:afafafaf cdw10:33333333 cdw11:33333333 SGL TRANSPORT DATA BLOCK TRANSPORT 0x3333333333333333 00:06:36.145 [2024-11-29 09:29:58.775937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:36.145 #55 NEW cov: 11823 ft: 14009 corp: 22/2615b lim: 320 exec/s: 55 rss: 69Mb L: 214/214 MS: 1 CopyPart- 00:06:36.145 [2024-11-29 09:29:58.815804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000c000 00:06:36.145 [2024-11-29 09:29:58.815828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.145 [2024-11-29 09:29:58.815897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:36.145 [2024-11-29 09:29:58.815910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.145 #56 NEW cov: 11823 ft: 14023 corp: 23/2774b lim: 320 exec/s: 56 rss: 69Mb L: 159/214 MS: 1 CopyPart- 00:06:36.145 [2024-11-29 09:29:58.855796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:36.145 [2024-11-29 09:29:58.855821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
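The '#N NEW cov: ...' lines in this stretch are standard LLVM libFuzzer status lines: cov counts covered code edges, ft counts features, corp is corpus entries and total bytes, lim is the current input-length cap, exec/s and rss are throughput and memory, L reports input lengths (new input over the largest in the corpus), and MS names the mutations that produced the input. A small sketch (bash; the parsing one-liner is illustrative, not part of the harness) that pulls the coverage counter out of such a line, e.g. to track progress over a run:

    line='#43 NEW cov: 11799 ft: 13465 corp: 12/1274b lim: 320 exec/s: 0 rss: 69Mb L: 200/200 MS: 1 ChangeASCIIInt-'
    cov=$(sed -n 's/.* cov: \([0-9]*\).*/\1/p' <<<"$line")
    echo "coverage edges so far: $cov"    # prints 11799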
00:06:36.145 #57 NEW cov: 11823 ft: 14040 corp: 24/2864b lim: 320 exec/s: 57 rss: 69Mb L: 90/214 MS: 1 CMP- DE: "Oh\374\341\301\317\223\000"- 00:06:36.145 [2024-11-29 09:29:58.885910] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:36.145 [2024-11-29 09:29:58.885936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.145 #58 NEW cov: 11823 ft: 14066 corp: 25/2971b lim: 320 exec/s: 58 rss: 69Mb L: 107/214 MS: 1 ShuffleBytes- 00:06:36.145 [2024-11-29 09:29:58.926176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:e1fc684f cdw10:33333333 cdw11:33333333 00:06:36.145 [2024-11-29 09:29:58.926200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.145 [2024-11-29 09:29:58.926263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (33) qid:0 cid:5 nsid:33333333 cdw10:afafafaf cdw11:afafafaf SGL TRANSPORT DATA BLOCK TRANSPORT 0xafafafafafafafaf 00:06:36.145 [2024-11-29 09:29:58.926277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.145 [2024-11-29 09:29:58.926337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (af) qid:0 cid:6 nsid:afafafaf cdw10:30303030 cdw11:30303030 SGL TRANSPORT DATA BLOCK TRANSPORT 0xafafafafafafafaf 00:06:36.145 [2024-11-29 09:29:58.926350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:36.145 #59 NEW cov: 11823 ft: 14085 corp: 26/3179b lim: 320 exec/s: 59 rss: 69Mb L: 208/214 MS: 1 PersAutoDict- DE: "Oh\374\341\301\317\223\000"- 00:06:36.145 [2024-11-29 09:29:58.966110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:36.145 [2024-11-29 09:29:58.966135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.145 #60 NEW cov: 11823 ft: 14097 corp: 27/3269b lim: 320 exec/s: 60 rss: 69Mb L: 90/214 MS: 1 CrossOver- 00:06:36.405 [2024-11-29 09:29:59.006183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00005900 cdw11:00000000 00:06:36.405 [2024-11-29 09:29:59.006207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.405 #61 NEW cov: 11823 ft: 14135 corp: 28/3359b lim: 320 exec/s: 61 rss: 69Mb L: 90/214 MS: 1 ChangeByte- 00:06:36.405 [2024-11-29 09:29:59.046306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000040 cdw11:00000000 00:06:36.405 [2024-11-29 09:29:59.046332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.405 #62 NEW cov: 11823 ft: 14147 corp: 29/3485b lim: 320 exec/s: 62 rss: 69Mb L: 126/214 MS: 1 CopyPart- 00:06:36.405 [2024-11-29 09:29:59.076764] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:b2b2b2b2 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb2b2b2b2b2ffffff 00:06:36.405 [2024-11-29 09:29:59.076789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.405 [2024-11-29 09:29:59.076864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b2) qid:0 cid:5 nsid:b2b2b2b2 cdw10:b2b2b2b2 cdw11:b2b2b2b2 SGL TRANSPORT DATA BLOCK TRANSPORT 0xb2b2b2b2b2b2b2b2 00:06:36.405 [2024-11-29 09:29:59.076878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.405 [2024-11-29 09:29:59.076938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (b2) qid:0 cid:6 nsid:b2b2b2b2 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffb2b2b2 00:06:36.405 [2024-11-29 09:29:59.076951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:36.405 #63 NEW cov: 11823 ft: 14306 corp: 30/3721b lim: 320 exec/s: 63 rss: 69Mb L: 236/236 MS: 1 InsertRepeatedBytes- 00:06:36.405 [2024-11-29 09:29:59.116532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:36.405 [2024-11-29 09:29:59.116556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.405 #64 NEW cov: 11823 ft: 14315 corp: 31/3809b lim: 320 exec/s: 64 rss: 69Mb L: 88/236 MS: 1 PersAutoDict- DE: "\001\223\317\301\304\017\351f"- 00:06:36.405 [2024-11-29 09:29:59.156648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:36.405 [2024-11-29 09:29:59.156672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.405 #65 NEW cov: 11823 ft: 14346 corp: 32/3899b lim: 320 exec/s: 65 rss: 70Mb L: 90/236 MS: 1 ChangeBinInt- 00:06:36.405 [2024-11-29 09:29:59.196814] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:36.405 [2024-11-29 09:29:59.196839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.405 #66 NEW cov: 11823 ft: 14354 corp: 33/4021b lim: 320 exec/s: 66 rss: 70Mb L: 122/236 MS: 1 InsertRepeatedBytes- 00:06:36.405 [2024-11-29 09:29:59.237062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:33333333 cdw11:33333333 00:06:36.405 [2024-11-29 09:29:59.237087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.405 [2024-11-29 09:29:59.237148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (33) qid:0 cid:5 nsid:33333333 cdw10:afafafaf cdw11:afafafaf SGL TRANSPORT DATA BLOCK TRANSPORT 0xafafafafafafafaf 00:06:36.405 [2024-11-29 09:29:59.237162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.405 [2024-11-29 09:29:59.237223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (af) qid:0 cid:6 nsid:afafafaf cdw10:33333333 cdw11:33333333 SGL TRANSPORT DATA BLOCK TRANSPORT 0x33333333afafafaf 00:06:36.405 [2024-11-29 09:29:59.237236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:36.665 #67 NEW cov: 11823 
ft: 14365 corp: 34/4225b lim: 320 exec/s: 67 rss: 70Mb L: 204/236 MS: 1 CMP- DE: "\000\000\000\000"- 00:06:36.665 [2024-11-29 09:29:59.277025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:36.665 [2024-11-29 09:29:59.277050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.665 #68 NEW cov: 11823 ft: 14381 corp: 35/4312b lim: 320 exec/s: 68 rss: 70Mb L: 87/236 MS: 1 ShuffleBytes- 00:06:36.665 [2024-11-29 09:29:59.317310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:0000c000 00:06:36.665 [2024-11-29 09:29:59.317336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.665 [2024-11-29 09:29:59.317389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:36.665 [2024-11-29 09:29:59.317402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.665 #74 NEW cov: 11823 ft: 14440 corp: 36/4471b lim: 320 exec/s: 74 rss: 70Mb L: 159/236 MS: 1 ShuffleBytes- 00:06:36.665 [2024-11-29 09:29:59.357484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:33333333 cdw11:33333333 00:06:36.665 [2024-11-29 09:29:59.357510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.665 [2024-11-29 09:29:59.357573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (33) qid:0 cid:5 nsid:33333333 cdw10:afafafaf cdw11:afafafaf SGL TRANSPORT DATA BLOCK TRANSPORT 0xafafafafafafafaf 00:06:36.665 [2024-11-29 09:29:59.357587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.665 [2024-11-29 09:29:59.357635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (af) qid:0 cid:6 nsid:afafafaf cdw10:33333333 cdw11:33333333 SGL TRANSPORT DATA BLOCK TRANSPORT 0x33333333afafafaf 00:06:36.665 [2024-11-29 09:29:59.357649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:36.665 #75 NEW cov: 11823 ft: 14444 corp: 37/4675b lim: 320 exec/s: 75 rss: 70Mb L: 204/236 MS: 1 CopyPart- 00:06:36.665 [2024-11-29 09:29:59.397393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:36.665 [2024-11-29 09:29:59.397417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.665 #76 NEW cov: 11823 ft: 14458 corp: 38/4748b lim: 320 exec/s: 76 rss: 70Mb L: 73/236 MS: 1 ChangeBinInt- 00:06:36.665 [2024-11-29 09:29:59.437489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:33333333 cdw11:33333333 00:06:36.665 [2024-11-29 09:29:59.437515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.665 #77 NEW cov: 11823 ft: 14489 corp: 39/4873b lim: 320 exec/s: 77 rss: 70Mb L: 125/236 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:06:36.665 [2024-11-29 09:29:59.477614] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:36.665 [2024-11-29 09:29:59.477640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.665 #78 NEW cov: 11823 ft: 14493 corp: 40/4995b lim: 320 exec/s: 78 rss: 70Mb L: 122/236 MS: 1 PersAutoDict- DE: "Oh\374\341\301\317\223\000"- 00:06:36.925 [2024-11-29 09:29:59.517828] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:36.925 [2024-11-29 09:29:59.517855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.925 [2024-11-29 09:29:59.517916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:fdfdfdfd cdw11:fdfdfdfd SGL TRANSPORT DATA BLOCK TRANSPORT 0xfdfdfdfdfdfdfdfd 00:06:36.925 [2024-11-29 09:29:59.517929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.925 #79 NEW cov: 11823 ft: 14560 corp: 41/5179b lim: 320 exec/s: 79 rss: 70Mb L: 184/236 MS: 1 InsertRepeatedBytes- 00:06:36.925 [2024-11-29 09:29:59.557837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:0066e90f cdw11:00000000 00:06:36.925 [2024-11-29 09:29:59.557862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.925 #80 NEW cov: 11823 ft: 14572 corp: 42/5271b lim: 320 exec/s: 80 rss: 70Mb L: 92/236 MS: 1 CMP- DE: "\000\000\000\000"- 00:06:36.925 [2024-11-29 09:29:59.597951] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:06:36.925 [2024-11-29 09:29:59.597977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.925 #81 NEW cov: 11823 ft: 14626 corp: 43/5354b lim: 320 exec/s: 81 rss: 70Mb L: 83/236 MS: 1 EraseBytes- 00:06:36.925 [2024-11-29 09:29:59.638020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:36.925 [2024-11-29 09:29:59.638044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.925 #82 NEW cov: 11823 ft: 14631 corp: 44/5443b lim: 320 exec/s: 82 rss: 70Mb L: 89/236 MS: 1 EraseBytes- 00:06:36.925 [2024-11-29 09:29:59.668372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c0) qid:0 cid:4 nsid:0 cdw10:33333333 cdw11:33333333 00:06:36.925 [2024-11-29 09:29:59.668397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:36.925 [2024-11-29 09:29:59.668459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (33) qid:0 cid:5 nsid:33333333 cdw10:afafafaf cdw11:afafafaf SGL TRANSPORT DATA BLOCK TRANSPORT 0xafafafafafafafaf 00:06:36.925 [2024-11-29 09:29:59.668473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:36.925 [2024-11-29 09:29:59.668546] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (af) qid:0 cid:6 nsid:afafafaf cdw10:33333333 cdw11:33333333 SGL TRANSPORT DATA BLOCK TRANSPORT 0x3333333333333333 00:06:36.925 [2024-11-29 09:29:59.668560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:36.925 #83 NEW cov: 11823 ft: 14643 corp: 45/5657b lim: 320 exec/s: 41 rss: 70Mb L: 214/236 MS: 1 ChangeBit- 00:06:36.925 #83 DONE cov: 11823 ft: 14643 corp: 45/5657b lim: 320 exec/s: 41 rss: 70Mb 00:06:36.925 ###### Recommended dictionary. ###### 00:06:36.925 "\001\223\317\301\304\017\351f" # Uses: 1 00:06:36.925 "Oh\374\341\301\317\223\000" # Uses: 3 00:06:36.925 "\000\000\000\000" # Uses: 1 00:06:36.925 ###### End of recommended dictionary. ###### 00:06:36.925 Done 83 runs in 2 second(s) 00:06:37.185 09:29:59 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:06:37.185 09:29:59 -- ../common.sh@72 -- # (( i++ )) 00:06:37.185 09:29:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:37.185 09:29:59 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:06:37.185 09:29:59 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:06:37.185 09:29:59 -- nvmf/run.sh@24 -- # local timen=1 00:06:37.185 09:29:59 -- nvmf/run.sh@25 -- # local core=0x1 00:06:37.185 09:29:59 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:06:37.185 09:29:59 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:06:37.185 09:29:59 -- nvmf/run.sh@29 -- # printf %02d 1 00:06:37.185 09:29:59 -- nvmf/run.sh@29 -- # port=4401 00:06:37.185 09:29:59 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:06:37.185 09:29:59 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:06:37.185 09:29:59 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:37.185 09:29:59 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:06:37.185 [2024-11-29 09:29:59.862357] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:37.185 [2024-11-29 09:29:59.862430] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3178589 ] 00:06:37.185 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.444 [2024-11-29 09:30:00.134962] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.444 [2024-11-29 09:30:00.220750] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:37.444 [2024-11-29 09:30:00.220887] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.444 [2024-11-29 09:30:00.279760] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:37.703 [2024-11-29 09:30:00.296208] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:06:37.703 INFO: Running with entropic power schedule (0xFF, 100). 00:06:37.703 INFO: Seed: 3052528790 00:06:37.703 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:37.703 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:37.703 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:06:37.703 INFO: A corpus is not provided, starting from an empty corpus 00:06:37.703 #2 INITED exec/s: 0 rss: 61Mb 00:06:37.704 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:37.704 This may also happen if the target rejected all inputs we tried so far 00:06:37.704 [2024-11-29 09:30:00.344421] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10692) > buf size (4096) 00:06:37.704 [2024-11-29 09:30:00.344559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a700000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.704 [2024-11-29 09:30:00.344585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.963 NEW_FUNC[1/670]: 0x43b158 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:06:37.963 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:37.963 #8 NEW cov: 11633 ft: 11640 corp: 2/10b lim: 30 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 CMP- DE: "p\000\000\000\000\000\000\000"- 00:06:37.963 [2024-11-29 09:30:00.685186] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000303 00:06:37.963 [2024-11-29 09:30:00.685272] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300000303 00:06:37.963 [2024-11-29 09:30:00.685388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a038303 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.963 [2024-11-29 09:30:00.685413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.963 [2024-11-29 09:30:00.685444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:03038303 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.963 [2024-11-29 09:30:00.685460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:37.963 NEW_FUNC[1/1]: 0x1298b98 in nvmf_tcp_poll_group_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:3336 00:06:37.963 #11 NEW cov: 11758 ft: 12542 corp: 3/22b lim: 30 exec/s: 0 rss: 68Mb L: 12/12 MS: 3 InsertByte-EraseBytes-InsertRepeatedBytes- 00:06:37.963 [2024-11-29 09:30:00.745196] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10692) > buf size (4096) 00:06:37.963 [2024-11-29 09:30:00.745329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a700000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:37.963 [2024-11-29 09:30:00.745357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:37.963 #12 NEW cov: 11764 ft: 12804 corp: 4/31b lim: 30 exec/s: 0 rss: 68Mb L: 9/12 MS: 1 ShuffleBytes- 00:06:38.222 [2024-11-29 09:30:00.805400] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (272836) > buf size (4096) 00:06:38.222 [2024-11-29 09:30:00.805523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a708100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.222 [2024-11-29 09:30:00.805547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.222 #23 NEW cov: 11849 ft: 13233 corp: 5/41b lim: 30 exec/s: 0 rss: 69Mb L: 10/12 MS: 1 InsertByte- 00:06:38.222 [2024-11-29 09:30:00.855489] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (272836) > buf size (4096) 00:06:38.222 [2024-11-29 09:30:00.855636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a708100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.222 [2024-11-29 09:30:00.855662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.222 #24 NEW cov: 11849 ft: 13369 corp: 6/48b lim: 30 exec/s: 0 rss: 69Mb L: 7/12 MS: 1 EraseBytes- 00:06:38.222 [2024-11-29 09:30:00.915693] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (273172) > buf size (4096) 00:06:38.222 [2024-11-29 09:30:00.915820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ac48100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.222 [2024-11-29 09:30:00.915844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.222 #25 NEW cov: 11849 ft: 13469 corp: 7/55b lim: 30 exec/s: 0 rss: 69Mb L: 7/12 MS: 1 ChangeByte- 00:06:38.222 [2024-11-29 09:30:00.975859] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (273172) > buf size (4096) 00:06:38.222 [2024-11-29 09:30:00.975978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ac48100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.222 [2024-11-29 09:30:00.976002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.222 #26 NEW cov: 11849 ft: 13616 corp: 8/62b lim: 30 exec/s: 0 rss: 69Mb L: 7/12 MS: 1 ChangeBinInt- 00:06:38.222 [2024-11-29 09:30:01.046011] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:06:38.222 
[2024-11-29 09:30:01.046143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000070 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.222 [2024-11-29 09:30:01.046166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.488 #27 NEW cov: 11849 ft: 13654 corp: 9/71b lim: 30 exec/s: 0 rss: 69Mb L: 9/12 MS: 1 ShuffleBytes- 00:06:38.488 [2024-11-29 09:30:01.106189] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (112) > len (4) 00:06:38.488 [2024-11-29 09:30:01.106309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.488 [2024-11-29 09:30:01.106332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.488 #28 NEW cov: 11862 ft: 13724 corp: 10/80b lim: 30 exec/s: 0 rss: 69Mb L: 9/12 MS: 1 ShuffleBytes- 00:06:38.488 [2024-11-29 09:30:01.156296] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (144) > len (4) 00:06:38.488 [2024-11-29 09:30:01.156415] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.488 [2024-11-29 09:30:01.156438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.488 #29 NEW cov: 11862 ft: 13745 corp: 11/89b lim: 30 exec/s: 0 rss: 69Mb L: 9/12 MS: 1 ChangeBinInt- 00:06:38.488 [2024-11-29 09:30:01.216530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.489 [2024-11-29 09:30:01.216560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.489 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:38.489 #30 NEW cov: 11889 ft: 13828 corp: 12/96b lim: 30 exec/s: 0 rss: 69Mb L: 7/12 MS: 1 EraseBytes- 00:06:38.489 [2024-11-29 09:30:01.266569] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (273172) > buf size (4096) 00:06:38.489 [2024-11-29 09:30:01.266757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ac48100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.489 [2024-11-29 09:30:01.266782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.489 [2024-11-29 09:30:01.266813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.489 [2024-11-29 09:30:01.266829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.489 #31 NEW cov: 11889 ft: 13886 corp: 13/111b lim: 30 exec/s: 0 rss: 69Mb L: 15/15 MS: 1 PersAutoDict- DE: "p\000\000\000\000\000\000\000"- 00:06:38.748 [2024-11-29 09:30:01.326834] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:06:38.748 [2024-11-29 09:30:01.326921] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page 
offset 0x200000e0e 00:06:38.748 [2024-11-29 09:30:01.326980] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:06:38.748 [2024-11-29 09:30:01.327037] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (14396) > buf size (4096) 00:06:38.748 [2024-11-29 09:30:01.327144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:000e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.748 [2024-11-29 09:30:01.327165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.748 [2024-11-29 09:30:01.327196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.748 [2024-11-29 09:30:01.327211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.748 [2024-11-29 09:30:01.327239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.748 [2024-11-29 09:30:01.327254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.748 [2024-11-29 09:30:01.327281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:0e0e000e cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.748 [2024-11-29 09:30:01.327296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:38.748 #32 NEW cov: 11889 ft: 14479 corp: 14/138b lim: 30 exec/s: 32 rss: 69Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:06:38.748 [2024-11-29 09:30:01.396988] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000c1c1 00:06:38.748 [2024-11-29 09:30:01.397063] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000c1c1 00:06:38.748 [2024-11-29 09:30:01.397122] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (200708) > buf size (4096) 00:06:38.748 [2024-11-29 09:30:01.397228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ac181c1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.748 [2024-11-29 09:30:01.397253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.748 [2024-11-29 09:30:01.397284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:c1c181c1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.748 [2024-11-29 09:30:01.397299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.748 [2024-11-29 09:30:01.397327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:c400005d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.748 [2024-11-29 09:30:01.397342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:38.748 #33 NEW cov: 11889 ft: 14758 corp: 15/156b lim: 30 exec/s: 33 rss: 69Mb L: 18/27 MS: 1 InsertRepeatedBytes- 00:06:38.748 [2024-11-29 09:30:01.447071] 
ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11028) > buf size (4096) 00:06:38.748 [2024-11-29 09:30:01.447157] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x5d 00:06:38.748 [2024-11-29 09:30:01.447265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ac40070 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.748 [2024-11-29 09:30:01.447286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.748 [2024-11-29 09:30:01.447317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.748 [2024-11-29 09:30:01.447332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.748 #34 NEW cov: 11889 ft: 14836 corp: 16/171b lim: 30 exec/s: 34 rss: 69Mb L: 15/27 MS: 1 PersAutoDict- DE: "p\000\000\000\000\000\000\000"- 00:06:38.748 [2024-11-29 09:30:01.517263] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:06:38.748 [2024-11-29 09:30:01.517349] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:06:38.748 [2024-11-29 09:30:01.517462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:8a0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.748 [2024-11-29 09:30:01.517484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:38.748 [2024-11-29 09:30:01.517514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.748 [2024-11-29 09:30:01.517530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:38.748 #37 NEW cov: 11889 ft: 14851 corp: 17/184b lim: 30 exec/s: 37 rss: 69Mb L: 13/27 MS: 3 ChangeBit-ShuffleBytes-CrossOver- 00:06:38.748 [2024-11-29 09:30:01.567372] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10692) > buf size (4096) 00:06:38.748 [2024-11-29 09:30:01.567495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a700000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:38.748 [2024-11-29 09:30:01.567518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.008 #38 NEW cov: 11889 ft: 14909 corp: 18/193b lim: 30 exec/s: 38 rss: 69Mb L: 9/27 MS: 1 PersAutoDict- DE: "p\000\000\000\000\000\000\000"- 00:06:39.008 [2024-11-29 09:30:01.617485] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10692) > buf size (4096) 00:06:39.008 [2024-11-29 09:30:01.617629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a700000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.008 [2024-11-29 09:30:01.617652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.008 #39 NEW cov: 11889 ft: 14960 corp: 19/202b lim: 30 exec/s: 39 rss: 69Mb L: 9/27 MS: 1 ChangeBit- 00:06:39.008 [2024-11-29 09:30:01.677737] 
ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000c152 00:06:39.008 [2024-11-29 09:30:01.677811] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000c1c1 00:06:39.008 [2024-11-29 09:30:01.677870] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (200708) > buf size (4096) 00:06:39.008 [2024-11-29 09:30:01.677976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ac181c1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.008 [2024-11-29 09:30:01.677997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.008 [2024-11-29 09:30:01.678027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:c1c181c1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.008 [2024-11-29 09:30:01.678042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.008 [2024-11-29 09:30:01.678070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:c400005d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.008 [2024-11-29 09:30:01.678085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:39.008 #40 NEW cov: 11889 ft: 14982 corp: 20/220b lim: 30 exec/s: 40 rss: 70Mb L: 18/27 MS: 1 ChangeByte- 00:06:39.008 [2024-11-29 09:30:01.747908] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000d6d6 00:06:39.008 [2024-11-29 09:30:01.747995] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (744284) > buf size (4096) 00:06:39.008 [2024-11-29 09:30:01.748107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08000200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.008 [2024-11-29 09:30:01.748129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.008 [2024-11-29 09:30:01.748159] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d6d602d6 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.008 [2024-11-29 09:30:01.748175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.008 #43 NEW cov: 11889 ft: 14993 corp: 21/233b lim: 30 exec/s: 43 rss: 70Mb L: 13/27 MS: 3 EraseBytes-ChangeBit-InsertRepeatedBytes- 00:06:39.008 [2024-11-29 09:30:01.818642] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:06:39.008 [2024-11-29 09:30:01.818776] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000e0e 00:06:39.008 [2024-11-29 09:30:01.818996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:8a0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.008 [2024-11-29 09:30:01.819026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.008 [2024-11-29 09:30:01.819061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0e0e020e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.008 
[2024-11-29 09:30:01.819075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.008 #44 NEW cov: 11889 ft: 15036 corp: 22/246b lim: 30 exec/s: 44 rss: 70Mb L: 13/27 MS: 1 ShuffleBytes- 00:06:39.267 [2024-11-29 09:30:01.858706] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10692) > buf size (4096) 00:06:39.267 [2024-11-29 09:30:01.858915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a700000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.267 [2024-11-29 09:30:01.858944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.267 #45 NEW cov: 11889 ft: 15088 corp: 23/255b lim: 30 exec/s: 45 rss: 70Mb L: 9/27 MS: 1 ChangeByte- 00:06:39.267 [2024-11-29 09:30:01.898896] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (272836) > buf size (4096) 00:06:39.267 [2024-11-29 09:30:01.899013] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200009a9a 00:06:39.267 [2024-11-29 09:30:01.899117] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200009a9a 00:06:39.267 [2024-11-29 09:30:01.899332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a708100 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.267 [2024-11-29 09:30:01.899358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.267 [2024-11-29 09:30:01.899412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:009a029a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.267 [2024-11-29 09:30:01.899426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.267 [2024-11-29 09:30:01.899479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:9a9a029a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.267 [2024-11-29 09:30:01.899492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:39.267 #46 NEW cov: 11889 ft: 15118 corp: 24/277b lim: 30 exec/s: 46 rss: 70Mb L: 22/27 MS: 1 InsertRepeatedBytes- 00:06:39.267 [2024-11-29 09:30:01.938945] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (11028) > buf size (4096) 00:06:39.267 [2024-11-29 09:30:01.939061] ctrlr.c:2547:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (11008) > len (4) 00:06:39.267 [2024-11-29 09:30:01.939263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ac40070 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.267 [2024-11-29 09:30:01.939287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.267 [2024-11-29 09:30:01.939341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.267 [2024-11-29 09:30:01.939355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.267 #47 NEW cov: 11889 ft: 15129 corp: 
25/293b lim: 30 exec/s: 47 rss: 70Mb L: 16/27 MS: 1 InsertByte- 00:06:39.267 [2024-11-29 09:30:01.979047] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:06:39.267 [2024-11-29 09:30:01.979259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000070 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.267 [2024-11-29 09:30:01.979284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.267 #48 NEW cov: 11889 ft: 15138 corp: 26/302b lim: 30 exec/s: 48 rss: 70Mb L: 9/27 MS: 1 ChangeByte- 00:06:39.267 [2024-11-29 09:30:02.019158] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:06:39.267 [2024-11-29 09:30:02.019390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000023 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.267 [2024-11-29 09:30:02.019416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.267 #50 NEW cov: 11889 ft: 15155 corp: 27/308b lim: 30 exec/s: 50 rss: 70Mb L: 6/27 MS: 2 EraseBytes-InsertByte- 00:06:39.267 [2024-11-29 09:30:02.059267] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10692) > buf size (4096) 00:06:39.267 [2024-11-29 09:30:02.059475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a700000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.267 [2024-11-29 09:30:02.059503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.267 #51 NEW cov: 11889 ft: 15169 corp: 28/317b lim: 30 exec/s: 51 rss: 70Mb L: 9/27 MS: 1 CopyPart- 00:06:39.267 [2024-11-29 09:30:02.099415] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000c152 00:06:39.267 [2024-11-29 09:30:02.099532] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000c1c1 00:06:39.267 [2024-11-29 09:30:02.099742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ac181c1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.267 [2024-11-29 09:30:02.099768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.268 [2024-11-29 09:30:02.099823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:c1c181c1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.268 [2024-11-29 09:30:02.099838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.527 #52 NEW cov: 11889 ft: 15194 corp: 29/332b lim: 30 exec/s: 52 rss: 70Mb L: 15/27 MS: 1 CrossOver- 00:06:39.527 [2024-11-29 09:30:02.139608] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009191 00:06:39.527 [2024-11-29 09:30:02.139724] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100009191 00:06:39.527 [2024-11-29 09:30:02.140034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a918191 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.527 [2024-11-29 09:30:02.140060] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.527 [2024-11-29 09:30:02.140116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:91918191 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.527 [2024-11-29 09:30:02.140129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.527 [2024-11-29 09:30:02.140184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00700000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.527 [2024-11-29 09:30:02.140197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:39.527 #53 NEW cov: 11889 ft: 15216 corp: 30/352b lim: 30 exec/s: 53 rss: 70Mb L: 20/27 MS: 1 InsertRepeatedBytes- 00:06:39.527 [2024-11-29 09:30:02.179593] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:06:39.527 [2024-11-29 09:30:02.179804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000070 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.527 [2024-11-29 09:30:02.179829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.527 #54 NEW cov: 11889 ft: 15233 corp: 31/361b lim: 30 exec/s: 54 rss: 70Mb L: 9/27 MS: 1 ShuffleBytes- 00:06:39.527 [2024-11-29 09:30:02.209723] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000c152 00:06:39.527 [2024-11-29 09:30:02.209838] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000c1c1 00:06:39.527 [2024-11-29 09:30:02.210044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ac181c1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.527 [2024-11-29 09:30:02.210069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.527 [2024-11-29 09:30:02.210124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:c1c181c1 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.527 [2024-11-29 09:30:02.210141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.527 #55 NEW cov: 11896 ft: 15259 corp: 32/376b lim: 30 exec/s: 55 rss: 70Mb L: 15/27 MS: 1 ShuffleBytes- 00:06:39.527 [2024-11-29 09:30:02.249855] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (6188) > buf size (4096) 00:06:39.527 [2024-11-29 09:30:02.250068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:060a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.527 [2024-11-29 09:30:02.250093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.527 #56 NEW cov: 11896 ft: 15280 corp: 33/386b lim: 30 exec/s: 56 rss: 70Mb L: 10/27 MS: 1 InsertByte- 00:06:39.527 [2024-11-29 09:30:02.290029] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000dada 00:06:39.527 [2024-11-29 09:30:02.290143] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (748396) > buf 
size (4096) 00:06:39.527 [2024-11-29 09:30:02.290248] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000d6d6 00:06:39.527 [2024-11-29 09:30:02.290457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:08da02da cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.527 [2024-11-29 09:30:02.290482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.527 [2024-11-29 09:30:02.290538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:dada02da cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.527 [2024-11-29 09:30:02.290552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:39.527 [2024-11-29 09:30:02.290611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00d602d6 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.527 [2024-11-29 09:30:02.290640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:39.527 #57 NEW cov: 11896 ft: 15289 corp: 34/409b lim: 30 exec/s: 57 rss: 70Mb L: 23/27 MS: 1 InsertRepeatedBytes- 00:06:39.527 [2024-11-29 09:30:02.330066] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10692) > buf size (4096) 00:06:39.527 [2024-11-29 09:30:02.330295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a700000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:39.527 [2024-11-29 09:30:02.330321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:39.527 #58 NEW cov: 11896 ft: 15313 corp: 35/418b lim: 30 exec/s: 29 rss: 70Mb L: 9/27 MS: 1 ChangeByte- 00:06:39.527 #58 DONE cov: 11896 ft: 15313 corp: 35/418b lim: 30 exec/s: 29 rss: 70Mb 00:06:39.527 ###### Recommended dictionary. ###### 00:06:39.527 "p\000\000\000\000\000\000\000" # Uses: 3 00:06:39.527 ###### End of recommended dictionary. 
###### 00:06:39.527 Done 58 runs in 2 second(s) 00:06:39.786 09:30:02 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:06:39.786 09:30:02 -- ../common.sh@72 -- # (( i++ )) 00:06:39.786 09:30:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:39.786 09:30:02 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:06:39.786 09:30:02 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:06:39.786 09:30:02 -- nvmf/run.sh@24 -- # local timen=1 00:06:39.786 09:30:02 -- nvmf/run.sh@25 -- # local core=0x1 00:06:39.786 09:30:02 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:06:39.786 09:30:02 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:06:39.786 09:30:02 -- nvmf/run.sh@29 -- # printf %02d 2 00:06:39.786 09:30:02 -- nvmf/run.sh@29 -- # port=4402 00:06:39.786 09:30:02 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:06:39.786 09:30:02 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:06:39.786 09:30:02 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:39.786 09:30:02 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:06:39.786 [2024-11-29 09:30:02.506843] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:39.786 [2024-11-29 09:30:02.506905] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3179046 ] 00:06:39.786 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.045 [2024-11-29 09:30:02.684923] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.045 [2024-11-29 09:30:02.746371] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:40.045 [2024-11-29 09:30:02.746515] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.045 [2024-11-29 09:30:02.804445] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:40.045 [2024-11-29 09:30:02.820825] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:06:40.045 INFO: Running with entropic power schedule (0xFF, 100). 00:06:40.045 INFO: Seed: 1283537862 00:06:40.045 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:40.045 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:40.045 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:06:40.045 INFO: A corpus is not provided, starting from an empty corpus 00:06:40.045 #2 INITED exec/s: 0 rss: 61Mb 00:06:40.045 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:06:40.045 This may also happen if the target rejected all inputs we tried so far 00:06:40.304 [2024-11-29 09:30:02.896683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.304 [2024-11-29 09:30:02.896720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.563 NEW_FUNC[1/670]: 0x43db78 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:06:40.563 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:40.563 #4 NEW cov: 11580 ft: 11581 corp: 2/8b lim: 35 exec/s: 0 rss: 68Mb L: 7/7 MS: 2 InsertRepeatedBytes-CrossOver- 00:06:40.563 [2024-11-29 09:30:03.227633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff00000a cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.563 [2024-11-29 09:30:03.227686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.563 #10 NEW cov: 11693 ft: 12229 corp: 3/16b lim: 35 exec/s: 0 rss: 68Mb L: 8/8 MS: 1 InsertByte- 00:06:40.563 [2024-11-29 09:30:03.277584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a0a00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.563 [2024-11-29 09:30:03.277620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.563 #16 NEW cov: 11699 ft: 12478 corp: 4/24b lim: 35 exec/s: 0 rss: 68Mb L: 8/8 MS: 1 ShuffleBytes- 00:06:40.563 [2024-11-29 09:30:03.327850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.563 [2024-11-29 09:30:03.327878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.563 #17 NEW cov: 11784 ft: 12764 corp: 5/33b lim: 35 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 CrossOver- 00:06:40.563 [2024-11-29 09:30:03.367966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.563 [2024-11-29 09:30:03.367996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.563 #18 NEW cov: 11784 ft: 12836 corp: 6/42b lim: 35 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 CrossOver- 00:06:40.821 [2024-11-29 09:30:03.418100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000a8a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.821 [2024-11-29 09:30:03.418128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.821 #19 NEW cov: 11784 ft: 12892 corp: 7/51b lim: 35 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 ChangeBit- 00:06:40.821 [2024-11-29 09:30:03.458183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00007c cdw11:0a00000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.821 [2024-11-29 09:30:03.458211] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.822 #20 NEW cov: 11784 ft: 13014 corp: 8/61b lim: 35 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 InsertByte- 00:06:40.822 [2024-11-29 09:30:03.498302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00007c cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.822 [2024-11-29 09:30:03.498330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.822 #21 NEW cov: 11784 ft: 13066 corp: 9/70b lim: 35 exec/s: 0 rss: 69Mb L: 9/10 MS: 1 EraseBytes- 00:06:40.822 [2024-11-29 09:30:03.538442] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:008a000a cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.822 [2024-11-29 09:30:03.538471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.822 #22 NEW cov: 11784 ft: 13152 corp: 10/77b lim: 35 exec/s: 0 rss: 69Mb L: 7/10 MS: 1 EraseBytes- 00:06:40.822 [2024-11-29 09:30:03.578491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff92000a cdw11:9f00cfc4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.822 [2024-11-29 09:30:03.578519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.822 #23 NEW cov: 11784 ft: 13210 corp: 11/86b lim: 35 exec/s: 0 rss: 69Mb L: 9/10 MS: 1 CMP- DE: "\377\222\317\304\237\345'\262"- 00:06:40.822 [2024-11-29 09:30:03.618653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00007c cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.822 [2024-11-29 09:30:03.618681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:40.822 #24 NEW cov: 11784 ft: 13225 corp: 12/95b lim: 35 exec/s: 0 rss: 69Mb L: 9/10 MS: 1 ChangeByte- 00:06:40.822 [2024-11-29 09:30:03.658741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:009a000a cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:40.822 [2024-11-29 09:30:03.658770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.080 #25 NEW cov: 11784 ft: 13249 corp: 13/102b lim: 35 exec/s: 0 rss: 69Mb L: 7/10 MS: 1 ChangeBit- 00:06:41.080 [2024-11-29 09:30:03.698840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:007a000a cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.080 [2024-11-29 09:30:03.698868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.080 #26 NEW cov: 11784 ft: 13290 corp: 14/109b lim: 35 exec/s: 0 rss: 69Mb L: 7/10 MS: 1 ChangeByte- 00:06:41.080 [2024-11-29 09:30:03.748908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:10100010 cdw11:10001010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.080 [2024-11-29 09:30:03.748940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.080 [2024-11-29 09:30:03.749076] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:10100010 cdw11:8a001010 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.080 [2024-11-29 09:30:03.749096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.080 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:41.080 #31 NEW cov: 11807 ft: 13640 corp: 15/126b lim: 35 exec/s: 0 rss: 69Mb L: 17/17 MS: 5 EraseBytes-ChangeByte-ChangeBinInt-ChangeBit-InsertRepeatedBytes- 00:06:41.080 [2024-11-29 09:30:03.799226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:0a00000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.080 [2024-11-29 09:30:03.799256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.080 #32 NEW cov: 11807 ft: 13798 corp: 16/138b lim: 35 exec/s: 0 rss: 69Mb L: 12/17 MS: 1 CrossOver- 00:06:41.080 [2024-11-29 09:30:03.849352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.080 [2024-11-29 09:30:03.849381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.080 #33 NEW cov: 11807 ft: 13807 corp: 17/147b lim: 35 exec/s: 33 rss: 69Mb L: 9/17 MS: 1 ChangeByte- 00:06:41.080 [2024-11-29 09:30:03.889067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a0a007c cdw11:0a00000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.080 [2024-11-29 09:30:03.889096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.080 #34 NEW cov: 11807 ft: 13836 corp: 18/157b lim: 35 exec/s: 34 rss: 69Mb L: 10/17 MS: 1 CopyPart- 00:06:41.339 [2024-11-29 09:30:03.939819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff0a00ff cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.339 [2024-11-29 09:30:03.939848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.339 #35 NEW cov: 11807 ft: 13939 corp: 19/166b lim: 35 exec/s: 35 rss: 69Mb L: 9/17 MS: 1 CopyPart- 00:06:41.339 [2024-11-29 09:30:03.989100] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:41.339 [2024-11-29 09:30:03.989455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a7a0000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.339 [2024-11-29 09:30:03.989492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.339 #36 NEW cov: 11816 ft: 13979 corp: 20/173b lim: 35 exec/s: 36 rss: 69Mb L: 7/17 MS: 1 ShuffleBytes- 00:06:41.339 [2024-11-29 09:30:04.029507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:009a000a cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.339 [2024-11-29 09:30:04.029537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.339 #37 NEW cov: 11816 ft: 13984 corp: 
21/180b lim: 35 exec/s: 37 rss: 69Mb L: 7/17 MS: 1 CrossOver- 00:06:41.339 [2024-11-29 09:30:04.069619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0077000a cdw11:0a009a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.339 [2024-11-29 09:30:04.069649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.339 #38 NEW cov: 11816 ft: 13988 corp: 22/188b lim: 35 exec/s: 38 rss: 69Mb L: 8/17 MS: 1 InsertByte- 00:06:41.339 [2024-11-29 09:30:04.120212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff00000a cdw11:00000007 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.339 [2024-11-29 09:30:04.120242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.339 #39 NEW cov: 11816 ft: 14004 corp: 23/196b lim: 35 exec/s: 39 rss: 69Mb L: 8/17 MS: 1 ChangeBinInt- 00:06:41.339 [2024-11-29 09:30:04.170411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:0a000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.339 [2024-11-29 09:30:04.170440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.597 #40 NEW cov: 11816 ft: 14060 corp: 24/209b lim: 35 exec/s: 40 rss: 69Mb L: 13/17 MS: 1 CrossOver- 00:06:41.597 [2024-11-29 09:30:04.220099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00007c cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.597 [2024-11-29 09:30:04.220128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.597 #41 NEW cov: 11816 ft: 14123 corp: 25/222b lim: 35 exec/s: 41 rss: 69Mb L: 13/17 MS: 1 CopyPart- 00:06:41.597 [2024-11-29 09:30:04.270736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00009a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.597 [2024-11-29 09:30:04.270765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.597 #42 NEW cov: 11816 ft: 14164 corp: 26/229b lim: 35 exec/s: 42 rss: 70Mb L: 7/17 MS: 1 ShuffleBytes- 00:06:41.597 [2024-11-29 09:30:04.331363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ff00000a cdw11:ff00000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.597 [2024-11-29 09:30:04.331396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.598 [2024-11-29 09:30:04.331529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.598 [2024-11-29 09:30:04.331547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.598 [2024-11-29 09:30:04.331670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.598 [2024-11-29 09:30:04.331689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 
dnr:0 00:06:41.598 #43 NEW cov: 11816 ft: 14398 corp: 27/253b lim: 35 exec/s: 43 rss: 70Mb L: 24/24 MS: 1 InsertRepeatedBytes- 00:06:41.598 [2024-11-29 09:30:04.380712] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:41.598 [2024-11-29 09:30:04.381083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:93cf0000 cdw11:4d00c515 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.598 [2024-11-29 09:30:04.381117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.598 #44 NEW cov: 11816 ft: 14407 corp: 28/262b lim: 35 exec/s: 44 rss: 70Mb L: 9/24 MS: 1 CMP- DE: "\000\223\317\305\025M\266\224"- 00:06:41.598 [2024-11-29 09:30:04.420763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a0a008a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.598 [2024-11-29 09:30:04.420792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.598 #45 NEW cov: 11816 ft: 14434 corp: 29/271b lim: 35 exec/s: 45 rss: 70Mb L: 9/24 MS: 1 ShuffleBytes- 00:06:41.857 [2024-11-29 09:30:04.471793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:ff000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.857 [2024-11-29 09:30:04.471826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.857 [2024-11-29 09:30:04.471945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:c49f00cf cdw11:b200e527 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.857 [2024-11-29 09:30:04.471964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.857 [2024-11-29 09:30:04.472083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:008a000a cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.857 [2024-11-29 09:30:04.472101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.857 #46 NEW cov: 11816 ft: 14494 corp: 30/292b lim: 35 exec/s: 46 rss: 70Mb L: 21/24 MS: 1 PersAutoDict- DE: "\377\222\317\304\237\345'\262"- 00:06:41.857 [2024-11-29 09:30:04.521272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a23007c cdw11:0a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.857 [2024-11-29 09:30:04.521301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.857 #48 NEW cov: 11816 ft: 14542 corp: 31/299b lim: 35 exec/s: 48 rss: 70Mb L: 7/24 MS: 2 EraseBytes-InsertByte- 00:06:41.857 [2024-11-29 09:30:04.550977] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:41.857 [2024-11-29 09:30:04.551349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0077000a cdw11:00009a01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.857 [2024-11-29 09:30:04.551377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.857 [2024-11-29 09:30:04.551499] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.857 [2024-11-29 09:30:04.551521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.857 #49 NEW cov: 11816 ft: 14559 corp: 32/315b lim: 35 exec/s: 49 rss: 70Mb L: 16/24 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:06:41.857 [2024-11-29 09:30:04.591662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:0a000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.857 [2024-11-29 09:30:04.591690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.857 #50 NEW cov: 11816 ft: 14564 corp: 33/328b lim: 35 exec/s: 50 rss: 70Mb L: 13/24 MS: 1 ShuffleBytes- 00:06:41.857 [2024-11-29 09:30:04.632215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.857 [2024-11-29 09:30:04.632245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.857 [2024-11-29 09:30:04.632364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.857 [2024-11-29 09:30:04.632382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:41.857 [2024-11-29 09:30:04.632501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:0000ff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.857 [2024-11-29 09:30:04.632518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:41.857 #51 NEW cov: 11816 ft: 14576 corp: 34/355b lim: 35 exec/s: 51 rss: 70Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:06:41.857 [2024-11-29 09:30:04.671877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:41.857 [2024-11-29 09:30:04.671908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:41.857 #52 NEW cov: 11816 ft: 14583 corp: 35/366b lim: 35 exec/s: 52 rss: 70Mb L: 11/27 MS: 1 EraseBytes- 00:06:42.116 [2024-11-29 09:30:04.712090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.116 [2024-11-29 09:30:04.712119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.116 #53 NEW cov: 11816 ft: 14587 corp: 36/374b lim: 35 exec/s: 53 rss: 70Mb L: 8/27 MS: 1 CopyPart- 00:06:42.116 [2024-11-29 09:30:04.751992] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:42.116 [2024-11-29 09:30:04.752361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:0a000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.116 [2024-11-29 09:30:04.752390] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.116 [2024-11-29 09:30:04.752507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:8a000000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.116 [2024-11-29 09:30:04.752526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.116 #54 NEW cov: 11816 ft: 14589 corp: 37/388b lim: 35 exec/s: 54 rss: 70Mb L: 14/27 MS: 1 InsertByte- 00:06:42.116 [2024-11-29 09:30:04.792304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:088a000a cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.116 [2024-11-29 09:30:04.792334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.116 #55 NEW cov: 11816 ft: 14593 corp: 38/395b lim: 35 exec/s: 55 rss: 70Mb L: 7/27 MS: 1 ChangeBit- 00:06:42.116 [2024-11-29 09:30:04.832297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a00000a cdw11:0a000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.116 [2024-11-29 09:30:04.832326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.116 #56 NEW cov: 11816 ft: 14597 corp: 39/404b lim: 35 exec/s: 56 rss: 70Mb L: 9/27 MS: 1 EraseBytes- 00:06:42.116 [2024-11-29 09:30:04.872249] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:06:42.116 [2024-11-29 09:30:04.872604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a0a008a cdw11:01000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.116 [2024-11-29 09:30:04.872633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:42.116 [2024-11-29 09:30:04.872759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:06:42.116 [2024-11-29 09:30:04.872781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:42.116 #57 NEW cov: 11816 ft: 14601 corp: 40/421b lim: 35 exec/s: 28 rss: 70Mb L: 17/27 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:06:42.116 #57 DONE cov: 11816 ft: 14601 corp: 40/421b lim: 35 exec/s: 28 rss: 70Mb 00:06:42.116 ###### Recommended dictionary. ###### 00:06:42.116 "\377\222\317\304\237\345'\262" # Uses: 1 00:06:42.116 "\000\223\317\305\025M\266\224" # Uses: 0 00:06:42.116 "\001\000\000\000\000\000\000\000" # Uses: 1 00:06:42.116 ###### End of recommended dictionary. 
###### 00:06:42.116 Done 57 runs in 2 second(s) 00:06:42.375 09:30:05 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:06:42.375 09:30:05 -- ../common.sh@72 -- # (( i++ )) 00:06:42.375 09:30:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:42.375 09:30:05 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:06:42.375 09:30:05 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:06:42.375 09:30:05 -- nvmf/run.sh@24 -- # local timen=1 00:06:42.375 09:30:05 -- nvmf/run.sh@25 -- # local core=0x1 00:06:42.375 09:30:05 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:06:42.375 09:30:05 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:06:42.375 09:30:05 -- nvmf/run.sh@29 -- # printf %02d 3 00:06:42.375 09:30:05 -- nvmf/run.sh@29 -- # port=4403 00:06:42.375 09:30:05 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:06:42.375 09:30:05 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:06:42.375 09:30:05 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:42.375 09:30:05 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:06:42.375 [2024-11-29 09:30:05.065661] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:42.375 [2024-11-29 09:30:05.065726] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3179890 ] 00:06:42.375 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.633 [2024-11-29 09:30:05.325108] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.633 [2024-11-29 09:30:05.417959] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:42.633 [2024-11-29 09:30:05.418092] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.633 [2024-11-29 09:30:05.476146] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:42.892 [2024-11-29 09:30:05.492452] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:06:42.892 INFO: Running with entropic power schedule (0xFF, 100). 00:06:42.892 INFO: Seed: 3953540373 00:06:42.892 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:42.892 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:42.892 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:06:42.892 INFO: A corpus is not provided, starting from an empty corpus 00:06:42.892 #2 INITED exec/s: 0 rss: 61Mb 00:06:42.892 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
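The Recommended dictionary block that closes each run (run 2's appears just above) lists the byte strings the fuzzer pulled in through CMP and PersAutoDict mutations, together with their use counts, and they are already printed in libFuzzer's AFL-style dictionary syntax. They could therefore be saved and replayed in a later session through the stock libFuzzer -dict= flag; a sketch, where the nvmf_2.dict file name is arbitrary and whether llvm_nvme_fuzz forwards extra flags through to libFuzzer is an assumption:

cat > nvmf_2.dict <<'EOF'
"\377\222\317\304\237\345'\262"
"\000\223\317\305\025M\266\224"
"\001\000\000\000\000\000\000\000"
EOF
# -dict=nvmf_2.dict is a standard libFuzzer flag; forwarding it via llvm_nvme_fuzz is assumed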
00:06:42.892 This may also happen if the target rejected all inputs we tried so far 00:06:43.152 NEW_FUNC[1/659]: 0x43f858 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:06:43.152 NEW_FUNC[2/659]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:43.152 #9 NEW cov: 11490 ft: 11491 corp: 2/10b lim: 20 exec/s: 0 rss: 68Mb L: 9/9 MS: 2 ChangeBit-InsertRepeatedBytes- 00:06:43.152 #15 NEW cov: 11603 ft: 12298 corp: 3/19b lim: 20 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 ChangeBit- 00:06:43.152 #16 NEW cov: 11609 ft: 12584 corp: 4/28b lim: 20 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 CrossOver- 00:06:43.411 #17 NEW cov: 11694 ft: 12837 corp: 5/37b lim: 20 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 ChangeBinInt- 00:06:43.411 #18 NEW cov: 11694 ft: 12994 corp: 6/46b lim: 20 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 ChangeBit- 00:06:43.411 #19 NEW cov: 11694 ft: 13042 corp: 7/54b lim: 20 exec/s: 0 rss: 68Mb L: 8/9 MS: 1 EraseBytes- 00:06:43.411 #20 NEW cov: 11694 ft: 13101 corp: 8/63b lim: 20 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 ChangeByte- 00:06:43.411 #21 NEW cov: 11694 ft: 13133 corp: 9/72b lim: 20 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 CrossOver- 00:06:43.670 #22 NEW cov: 11694 ft: 13162 corp: 10/81b lim: 20 exec/s: 0 rss: 68Mb L: 9/9 MS: 1 ChangeBinInt- 00:06:43.670 #23 NEW cov: 11694 ft: 13459 corp: 11/86b lim: 20 exec/s: 0 rss: 69Mb L: 5/9 MS: 1 EraseBytes- 00:06:43.670 NEW_FUNC[1/4]: 0x111e188 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3224 00:06:43.670 NEW_FUNC[2/4]: 0x111ed08 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3166 00:06:43.670 #27 NEW cov: 11778 ft: 13582 corp: 12/96b lim: 20 exec/s: 0 rss: 69Mb L: 10/10 MS: 4 ChangeByte-InsertByte-CrossOver-CrossOver- 00:06:43.670 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:43.670 #28 NEW cov: 11801 ft: 13621 corp: 13/106b lim: 20 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 ChangeBit- 00:06:43.670 #29 NEW cov: 11801 ft: 13630 corp: 14/115b lim: 20 exec/s: 0 rss: 69Mb L: 9/10 MS: 1 CopyPart- 00:06:43.929 #30 NEW cov: 11818 ft: 14008 corp: 15/131b lim: 20 exec/s: 30 rss: 69Mb L: 16/16 MS: 1 CrossOver- 00:06:43.929 #31 NEW cov: 11818 ft: 14024 corp: 16/141b lim: 20 exec/s: 31 rss: 69Mb L: 10/16 MS: 1 CopyPart- 00:06:43.929 #32 NEW cov: 11818 ft: 14040 corp: 17/150b lim: 20 exec/s: 32 rss: 69Mb L: 9/16 MS: 1 ShuffleBytes- 00:06:43.929 #33 NEW cov: 11818 ft: 14070 corp: 18/158b lim: 20 exec/s: 33 rss: 69Mb L: 8/16 MS: 1 EraseBytes- 00:06:43.929 #34 NEW cov: 11818 ft: 14089 corp: 19/176b lim: 20 exec/s: 34 rss: 69Mb L: 18/18 MS: 1 CMP- DE: "\001\016"- 00:06:44.189 #35 NEW cov: 11822 ft: 14183 corp: 20/188b lim: 20 exec/s: 35 rss: 69Mb L: 12/18 MS: 1 CopyPart- 00:06:44.189 #36 NEW cov: 11822 ft: 14185 corp: 21/197b lim: 20 exec/s: 36 rss: 69Mb L: 9/18 MS: 1 CrossOver- 00:06:44.189 #37 NEW cov: 11822 ft: 14211 corp: 22/215b lim: 20 exec/s: 37 rss: 69Mb L: 18/18 MS: 1 ShuffleBytes- 00:06:44.189 #38 NEW cov: 11822 ft: 14225 corp: 23/231b lim: 20 exec/s: 38 rss: 69Mb L: 16/18 MS: 1 ChangeBinInt- 00:06:44.189 #39 NEW cov: 11822 ft: 14238 corp: 24/250b lim: 20 exec/s: 39 rss: 69Mb L: 19/19 MS: 1 InsertByte- 00:06:44.448 #40 NEW cov: 11822 ft: 14267 corp: 25/260b lim: 20 exec/s: 40 rss: 69Mb L: 10/19 MS: 1 CopyPart- 00:06:44.448 #41 NEW cov: 11822 ft: 14304 corp: 26/269b 
lim: 20 exec/s: 41 rss: 69Mb L: 9/19 MS: 1 ChangeByte- 00:06:44.448 #42 NEW cov: 11822 ft: 14317 corp: 27/287b lim: 20 exec/s: 42 rss: 69Mb L: 18/19 MS: 1 ChangeBit- 00:06:44.448 #43 NEW cov: 11822 ft: 14329 corp: 28/297b lim: 20 exec/s: 43 rss: 70Mb L: 10/19 MS: 1 CMP- DE: "\001\000\177T|\016\\\330"- 00:06:44.448 #44 NEW cov: 11822 ft: 14337 corp: 29/305b lim: 20 exec/s: 44 rss: 70Mb L: 8/19 MS: 1 EraseBytes- 00:06:44.707 #45 NEW cov: 11822 ft: 14345 corp: 30/317b lim: 20 exec/s: 45 rss: 70Mb L: 12/19 MS: 1 InsertRepeatedBytes- 00:06:44.707 #46 NEW cov: 11822 ft: 14355 corp: 31/334b lim: 20 exec/s: 46 rss: 70Mb L: 17/19 MS: 1 CopyPart- 00:06:44.707 #47 NEW cov: 11822 ft: 14367 corp: 32/351b lim: 20 exec/s: 47 rss: 70Mb L: 17/19 MS: 1 EraseBytes- 00:06:44.707 #48 NEW cov: 11822 ft: 14380 corp: 33/368b lim: 20 exec/s: 48 rss: 70Mb L: 17/19 MS: 1 ChangeByte- 00:06:44.707 #49 NEW cov: 11822 ft: 14383 corp: 34/385b lim: 20 exec/s: 24 rss: 70Mb L: 17/19 MS: 1 CMP- DE: "\036\000\000\000"- 00:06:44.707 #49 DONE cov: 11822 ft: 14383 corp: 34/385b lim: 20 exec/s: 24 rss: 70Mb 00:06:44.707 ###### Recommended dictionary. ###### 00:06:44.707 "\001\016" # Uses: 0 00:06:44.707 "\001\000\177T|\016\\\330" # Uses: 0 00:06:44.707 "\036\000\000\000" # Uses: 0 00:06:44.707 ###### End of recommended dictionary. ###### 00:06:44.707 Done 49 runs in 2 second(s) 00:06:44.967 09:30:07 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:06:44.967 09:30:07 -- ../common.sh@72 -- # (( i++ )) 00:06:44.967 09:30:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:44.967 09:30:07 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:06:44.967 09:30:07 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:06:44.967 09:30:07 -- nvmf/run.sh@24 -- # local timen=1 00:06:44.967 09:30:07 -- nvmf/run.sh@25 -- # local core=0x1 00:06:44.967 09:30:07 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:06:44.967 09:30:07 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:06:44.967 09:30:07 -- nvmf/run.sh@29 -- # printf %02d 4 00:06:44.967 09:30:07 -- nvmf/run.sh@29 -- # port=4404 00:06:44.967 09:30:07 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:06:44.967 09:30:07 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:06:44.967 09:30:07 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:44.967 09:30:07 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:06:44.967 [2024-11-29 09:30:07.706397] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
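The two ../common.sh@72 entries traced between runs, the increment and the bound check, share one source line, which marks the driver as a C-style for loop: common.sh walks the fuzzer index and re-enters start_llvm_fuzz with the same one-second budget and core mask until every fuzzer type has run. A sketch of that loop; the initializer and the definitions of fuzz_num, timen, and core are not visible in the trace and are assumptions read off the traced start_llvm_fuzz 4 1 0x1 call:

timen=1      # per-run time budget, matches the -t 1 in the launch lines
core=0x1     # core mask, matches -m 0x1
for ((i = 0; i < fuzz_num; i++)); do
    start_llvm_fuzz "$i" "$timen" "$core"   # per-run port/config setup sketched earlier
done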
00:06:44.968 [2024-11-29 09:30:07.706462] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3180532 ] 00:06:44.968 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.227 [2024-11-29 09:30:07.967762] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.227 [2024-11-29 09:30:08.058799] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:45.227 [2024-11-29 09:30:08.058943] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.487 [2024-11-29 09:30:08.116864] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:45.487 [2024-11-29 09:30:08.133257] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:06:45.487 INFO: Running with entropic power schedule (0xFF, 100). 00:06:45.487 INFO: Seed: 2301578241 00:06:45.487 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:45.487 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:45.487 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:06:45.487 INFO: A corpus is not provided, starting from an empty corpus 00:06:45.487 #2 INITED exec/s: 0 rss: 60Mb 00:06:45.487 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:06:45.487 This may also happen if the target rejected all inputs we tried so far 00:06:45.487 [2024-11-29 09:30:08.188378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:36360136 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.487 [2024-11-29 09:30:08.188406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.746 NEW_FUNC[1/671]: 0x440958 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:06:45.746 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:45.746 #5 NEW cov: 11601 ft: 11602 corp: 2/14b lim: 35 exec/s: 0 rss: 67Mb L: 13/13 MS: 3 ChangeBinInt-CrossOver-InsertRepeatedBytes- 00:06:45.746 [2024-11-29 09:30:08.489517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:36360136 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.746 [2024-11-29 09:30:08.489548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.746 [2024-11-29 09:30:08.489607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.746 [2024-11-29 09:30:08.489621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.746 [2024-11-29 09:30:08.489677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.746 [2024-11-29 09:30:08.489690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:45.746 #6 NEW cov: 11714 ft: 12731 corp: 3/35b lim: 35 exec/s: 0 rss: 68Mb L: 21/21 MS: 1 InsertRepeatedBytes- 00:06:45.746 [2024-11-29 09:30:08.539614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c6c632c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.746 [2024-11-29 09:30:08.539644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:45.746 [2024-11-29 09:30:08.539716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.746 [2024-11-29 09:30:08.539731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:45.746 [2024-11-29 09:30:08.539794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6c6c6c6 cdw11:c6ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.746 [2024-11-29 09:30:08.539808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:45.746 #10 NEW cov: 11720 ft: 12962 corp: 4/58b lim: 35 exec/s: 0 rss: 68Mb L: 23/23 MS: 4 InsertRepeatedBytes-CrossOver-InsertByte-InsertRepeatedBytes- 00:06:45.746 [2024-11-29 09:30:08.579396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff0aff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:45.746 [2024-11-29 09:30:08.579422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.006 #12 NEW cov: 11805 ft: 13328 corp: 5/69b lim: 35 exec/s: 0 rss: 68Mb L: 11/23 MS: 2 CopyPart-InsertRepeatedBytes- 00:06:46.006 [2024-11-29 09:30:08.619778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff32c6 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.006 [2024-11-29 09:30:08.619805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.006 [2024-11-29 09:30:08.619859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.006 [2024-11-29 09:30:08.619873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.006 [2024-11-29 09:30:08.619926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.006 [2024-11-29 09:30:08.619939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:46.006 #13 NEW cov: 11805 ft: 13523 corp: 6/96b lim: 35 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:06:46.006 [2024-11-29 09:30:08.660225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff32c6 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.006 [2024-11-29 09:30:08.660250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.006 
[2024-11-29 09:30:08.660304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0100c6c6 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.006 [2024-11-29 09:30:08.660318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.006 [2024-11-29 09:30:08.660370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:3fc60000 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.006 [2024-11-29 09:30:08.660383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:46.006 [2024-11-29 09:30:08.660437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.006 [2024-11-29 09:30:08.660450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:46.006 [2024-11-29 09:30:08.660506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:c6ffc6c6 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.006 [2024-11-29 09:30:08.660521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:46.006 #14 NEW cov: 11805 ft: 14088 corp: 7/131b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 CMP- DE: "\001\000\000\000\000\000\000?"- 00:06:46.006 [2024-11-29 09:30:08.709926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:36360136 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.006 [2024-11-29 09:30:08.709952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.006 [2024-11-29 09:30:08.710008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:36470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.006 [2024-11-29 09:30:08.710021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.006 #15 NEW cov: 11805 ft: 14436 corp: 8/150b lim: 35 exec/s: 0 rss: 68Mb L: 19/35 MS: 1 InsertRepeatedBytes- 00:06:46.007 [2024-11-29 09:30:08.750198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:31c632c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.007 [2024-11-29 09:30:08.750223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.007 [2024-11-29 09:30:08.750279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.007 [2024-11-29 09:30:08.750292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.007 [2024-11-29 09:30:08.750346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6c6c6c6 cdw11:c6ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.007 [2024-11-29 09:30:08.750376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 
sqhd:0011 p:0 m:0 dnr:0 00:06:46.007 #16 NEW cov: 11805 ft: 14517 corp: 9/173b lim: 35 exec/s: 0 rss: 68Mb L: 23/35 MS: 1 ChangeBinInt- 00:06:46.007 [2024-11-29 09:30:08.789977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.007 [2024-11-29 09:30:08.790001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.007 #17 NEW cov: 11805 ft: 14542 corp: 10/184b lim: 35 exec/s: 0 rss: 68Mb L: 11/35 MS: 1 CopyPart- 00:06:46.007 [2024-11-29 09:30:08.830125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.007 [2024-11-29 09:30:08.830151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.267 #18 NEW cov: 11805 ft: 14566 corp: 11/192b lim: 35 exec/s: 0 rss: 68Mb L: 8/35 MS: 1 EraseBytes- 00:06:46.267 [2024-11-29 09:30:08.870543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff32c6 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.267 [2024-11-29 09:30:08.870569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.267 [2024-11-29 09:30:08.870626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.267 [2024-11-29 09:30:08.870641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.267 [2024-11-29 09:30:08.870696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6c6c6c6 cdw11:c6ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.267 [2024-11-29 09:30:08.870711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:46.267 #19 NEW cov: 11805 ft: 14582 corp: 12/215b lim: 35 exec/s: 0 rss: 68Mb L: 23/35 MS: 1 EraseBytes- 00:06:46.267 [2024-11-29 09:30:08.920742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:31c632c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.267 [2024-11-29 09:30:08.920769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.267 [2024-11-29 09:30:08.920824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.267 [2024-11-29 09:30:08.920838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.267 [2024-11-29 09:30:08.920891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.267 [2024-11-29 09:30:08.920904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:46.267 #20 NEW cov: 11805 ft: 14599 corp: 13/242b lim: 35 exec/s: 0 rss: 68Mb L: 27/35 MS: 1 CopyPart- 00:06:46.267 
[2024-11-29 09:30:08.970530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffa2c6 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.267 [2024-11-29 09:30:08.970556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.267 #24 NEW cov: 11805 ft: 14603 corp: 14/250b lim: 35 exec/s: 0 rss: 68Mb L: 8/35 MS: 4 InsertByte-ChangeByte-CrossOver-InsertRepeatedBytes- 00:06:46.267 [2024-11-29 09:30:09.001101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:31aa32c6 cdw11:aaaa0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.267 [2024-11-29 09:30:09.001126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.267 [2024-11-29 09:30:09.001182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6aac6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.267 [2024-11-29 09:30:09.001196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.267 [2024-11-29 09:30:09.001250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.267 [2024-11-29 09:30:09.001279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:46.267 [2024-11-29 09:30:09.001333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.267 [2024-11-29 09:30:09.001347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:46.267 #25 NEW cov: 11805 ft: 14632 corp: 15/282b lim: 35 exec/s: 0 rss: 68Mb L: 32/35 MS: 1 InsertRepeatedBytes- 00:06:46.267 [2024-11-29 09:30:09.050773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:36360136 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.267 [2024-11-29 09:30:09.050798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.267 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:46.267 #26 NEW cov: 11828 ft: 14689 corp: 16/294b lim: 35 exec/s: 0 rss: 69Mb L: 12/35 MS: 1 EraseBytes- 00:06:46.267 [2024-11-29 09:30:09.090858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.267 [2024-11-29 09:30:09.090888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.527 #27 NEW cov: 11828 ft: 14739 corp: 17/306b lim: 35 exec/s: 0 rss: 69Mb L: 12/35 MS: 1 CopyPart- 00:06:46.527 [2024-11-29 09:30:09.131468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:31aa32c6 cdw11:aaaa0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.527 [2024-11-29 09:30:09.131494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:06:46.527 [2024-11-29 09:30:09.131549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6aac6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.527 [2024-11-29 09:30:09.131563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.527 [2024-11-29 09:30:09.131619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.527 [2024-11-29 09:30:09.131633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:46.527 [2024-11-29 09:30:09.131685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0000c601 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.527 [2024-11-29 09:30:09.131698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:46.527 #33 NEW cov: 11828 ft: 14753 corp: 18/338b lim: 35 exec/s: 0 rss: 69Mb L: 32/35 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000?"- 00:06:46.527 [2024-11-29 09:30:09.171596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:31aa32c6 cdw11:aaaa0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.527 [2024-11-29 09:30:09.171625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.527 [2024-11-29 09:30:09.171679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6aac6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.527 [2024-11-29 09:30:09.171693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.527 [2024-11-29 09:30:09.171748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.527 [2024-11-29 09:30:09.171761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:46.527 [2024-11-29 09:30:09.171814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0000c601 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.527 [2024-11-29 09:30:09.171827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:46.527 #34 NEW cov: 11828 ft: 14769 corp: 19/370b lim: 35 exec/s: 34 rss: 69Mb L: 32/35 MS: 1 ChangeBit- 00:06:46.527 [2024-11-29 09:30:09.221592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:36363036 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.527 [2024-11-29 09:30:09.221621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.527 [2024-11-29 09:30:09.221677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.527 [2024-11-29 09:30:09.221691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.527 [2024-11-29 09:30:09.221745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.527 [2024-11-29 09:30:09.221761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:46.527 #35 NEW cov: 11828 ft: 14799 corp: 20/391b lim: 35 exec/s: 35 rss: 69Mb L: 21/35 MS: 1 ChangeByte- 00:06:46.527 [2024-11-29 09:30:09.261840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01ff32c6 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.527 [2024-11-29 09:30:09.261864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.528 [2024-11-29 09:30:09.261920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c601c6c6 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.528 [2024-11-29 09:30:09.261933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.528 [2024-11-29 09:30:09.261988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:36360000 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.528 [2024-11-29 09:30:09.262018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:46.528 [2024-11-29 09:30:09.262072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:36363636 cdw11:3fc60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.528 [2024-11-29 09:30:09.262085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:46.528 #36 NEW cov: 11828 ft: 14814 corp: 21/425b lim: 35 exec/s: 36 rss: 69Mb L: 34/35 MS: 1 CrossOver- 00:06:46.528 [2024-11-29 09:30:09.301962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:31aa32c6 cdw11:aaaa0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.528 [2024-11-29 09:30:09.301987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.528 [2024-11-29 09:30:09.302060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6aac6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.528 [2024-11-29 09:30:09.302074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.528 [2024-11-29 09:30:09.302128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.528 [2024-11-29 09:30:09.302141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:46.528 [2024-11-29 09:30:09.302195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:c6ccc6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.528 [2024-11-29 09:30:09.302208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:46.528 #37 NEW cov: 11828 ft: 14854 corp: 22/457b lim: 35 exec/s: 37 rss: 69Mb L: 32/35 MS: 1 ChangeByte- 00:06:46.528 [2024-11-29 09:30:09.342248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff32d6 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.528 [2024-11-29 09:30:09.342272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.528 [2024-11-29 09:30:09.342329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0100c6c6 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.528 [2024-11-29 09:30:09.342343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.528 [2024-11-29 09:30:09.342395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:3fc60000 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.528 [2024-11-29 09:30:09.342412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:46.528 [2024-11-29 09:30:09.342464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.528 [2024-11-29 09:30:09.342477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:46.528 [2024-11-29 09:30:09.342530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:c6ffc6c6 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.528 [2024-11-29 09:30:09.342543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:46.528 #38 NEW cov: 11828 ft: 14888 corp: 23/492b lim: 35 exec/s: 38 rss: 69Mb L: 35/35 MS: 1 ChangeBit- 00:06:46.787 [2024-11-29 09:30:09.381880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:36360136 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.787 [2024-11-29 09:30:09.381905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.788 [2024-11-29 09:30:09.381960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:36363636 cdw11:36470002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.788 [2024-11-29 09:30:09.381973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.788 #39 NEW cov: 11828 ft: 14914 corp: 24/511b lim: 35 exec/s: 39 rss: 69Mb L: 19/35 MS: 1 ChangeBit- 00:06:46.788 [2024-11-29 09:30:09.422177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:31aa32c6 cdw11:aaaa0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.788 [2024-11-29 09:30:09.422202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.788 [2024-11-29 09:30:09.422258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6aac6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:06:46.788 [2024-11-29 09:30:09.422271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.788 [2024-11-29 09:30:09.422324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6ffc6c6 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.788 [2024-11-29 09:30:09.422337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:46.788 [2024-11-29 09:30:09.462267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:31aa32c6 cdw11:aaaa0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.788 [2024-11-29 09:30:09.462293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.788 [2024-11-29 09:30:09.462348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c7c6aac6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.788 [2024-11-29 09:30:09.462362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.788 [2024-11-29 09:30:09.462416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6ffc6c6 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.788 [2024-11-29 09:30:09.462429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:46.788 #41 NEW cov: 11828 ft: 14938 corp: 25/532b lim: 35 exec/s: 41 rss: 69Mb L: 21/35 MS: 2 EraseBytes-ChangeBit- 00:06:46.788 [2024-11-29 09:30:09.502074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.788 [2024-11-29 09:30:09.502103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.788 #42 NEW cov: 11828 ft: 15024 corp: 26/543b lim: 35 exec/s: 42 rss: 69Mb L: 11/35 MS: 1 EraseBytes- 00:06:46.788 [2024-11-29 09:30:09.542181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.788 [2024-11-29 09:30:09.542207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.788 #43 NEW cov: 11828 ft: 15105 corp: 27/554b lim: 35 exec/s: 43 rss: 69Mb L: 11/35 MS: 1 ChangeBinInt- 00:06:46.788 [2024-11-29 09:30:09.582819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01ff32c6 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.788 [2024-11-29 09:30:09.582844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.788 [2024-11-29 09:30:09.582901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c601c6c6 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.788 [2024-11-29 09:30:09.582914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.788 [2024-11-29 09:30:09.582967] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:36360000 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.788 [2024-11-29 09:30:09.582980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:46.788 [2024-11-29 09:30:09.583031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:36363636 cdw11:36360003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.788 [2024-11-29 09:30:09.583044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:46.788 #49 NEW cov: 11828 ft: 15118 corp: 28/588b lim: 35 exec/s: 49 rss: 70Mb L: 34/35 MS: 1 CrossOver- 00:06:46.788 [2024-11-29 09:30:09.622779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:31c632c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.788 [2024-11-29 09:30:09.622804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:46.788 [2024-11-29 09:30:09.622875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.788 [2024-11-29 09:30:09.622889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:46.788 [2024-11-29 09:30:09.622953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6c6c6c7 cdw11:c6ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:46.788 [2024-11-29 09:30:09.622967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.048 #50 NEW cov: 11828 ft: 15121 corp: 29/611b lim: 35 exec/s: 50 rss: 70Mb L: 23/35 MS: 1 ChangeBit- 00:06:47.048 [2024-11-29 09:30:09.663072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:31c632c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.048 [2024-11-29 09:30:09.663096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.048 [2024-11-29 09:30:09.663150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.048 [2024-11-29 09:30:09.663164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.048 [2024-11-29 09:30:09.663219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.048 [2024-11-29 09:30:09.663235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.048 [2024-11-29 09:30:09.663287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:c6ffc626 cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.048 [2024-11-29 09:30:09.663300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:47.048 #51 NEW cov: 11828 ft: 15205 corp: 30/639b 
lim: 35 exec/s: 51 rss: 70Mb L: 28/35 MS: 1 InsertByte- 00:06:47.048 [2024-11-29 09:30:09.703181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:31aa32c6 cdw11:aaaa0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.048 [2024-11-29 09:30:09.703205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.048 [2024-11-29 09:30:09.703259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6aac6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.048 [2024-11-29 09:30:09.703273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.048 [2024-11-29 09:30:09.703326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.048 [2024-11-29 09:30:09.703339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.048 [2024-11-29 09:30:09.703391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0000c601 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.048 [2024-11-29 09:30:09.703404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:47.048 #52 NEW cov: 11828 ft: 15241 corp: 31/671b lim: 35 exec/s: 52 rss: 70Mb L: 32/35 MS: 1 CopyPart- 00:06:47.048 [2024-11-29 09:30:09.742948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.048 [2024-11-29 09:30:09.742973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.048 [2024-11-29 09:30:09.743028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.048 [2024-11-29 09:30:09.743041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.048 #53 NEW cov: 11828 ft: 15280 corp: 32/686b lim: 35 exec/s: 53 rss: 70Mb L: 15/35 MS: 1 CrossOver- 00:06:47.048 [2024-11-29 09:30:09.783424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:31aa32c6 cdw11:aaaa0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.048 [2024-11-29 09:30:09.783449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.048 [2024-11-29 09:30:09.783505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6aac6 cdw11:00c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.048 [2024-11-29 09:30:09.783519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.048 [2024-11-29 09:30:09.783571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.048 [2024-11-29 09:30:09.783584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.048 [2024-11-29 09:30:09.783643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0000c601 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.048 [2024-11-29 09:30:09.783659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:47.048 #54 NEW cov: 11828 ft: 15283 corp: 33/718b lim: 35 exec/s: 54 rss: 70Mb L: 32/35 MS: 1 CopyPart- 00:06:47.048 [2024-11-29 09:30:09.823356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:31c632c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.048 [2024-11-29 09:30:09.823381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.048 [2024-11-29 09:30:09.823436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.049 [2024-11-29 09:30:09.823449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.049 [2024-11-29 09:30:09.823501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c62cc6c7 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.049 [2024-11-29 09:30:09.823531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.049 #55 NEW cov: 11828 ft: 15298 corp: 34/742b lim: 35 exec/s: 55 rss: 70Mb L: 24/35 MS: 1 InsertByte- 00:06:47.049 [2024-11-29 09:30:09.863496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:31c632c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.049 [2024-11-29 09:30:09.863521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.049 [2024-11-29 09:30:09.863576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.049 [2024-11-29 09:30:09.863589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.049 [2024-11-29 09:30:09.863648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.049 [2024-11-29 09:30:09.863661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.049 #56 NEW cov: 11828 ft: 15304 corp: 35/769b lim: 35 exec/s: 56 rss: 70Mb L: 27/35 MS: 1 CrossOver- 00:06:47.308 [2024-11-29 09:30:09.903199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:36360136 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.308 [2024-11-29 09:30:09.903224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.308 #57 NEW cov: 11828 ft: 15329 corp: 36/782b lim: 35 exec/s: 57 rss: 70Mb L: 13/35 MS: 1 ChangeASCIIInt- 00:06:47.308 [2024-11-29 09:30:09.943702] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffff32c6 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.308 [2024-11-29 09:30:09.943727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.308 [2024-11-29 09:30:09.943781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.308 [2024-11-29 09:30:09.943795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.308 [2024-11-29 09:30:09.943852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c7c6c6c6 cdw11:c6ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.308 [2024-11-29 09:30:09.943865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.308 #58 NEW cov: 11828 ft: 15367 corp: 37/805b lim: 35 exec/s: 58 rss: 70Mb L: 23/35 MS: 1 ChangeBit- 00:06:47.308 [2024-11-29 09:30:09.994028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:36360136 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.308 [2024-11-29 09:30:09.994053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.308 [2024-11-29 09:30:09.994109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.308 [2024-11-29 09:30:09.994123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.308 [2024-11-29 09:30:09.994178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.308 [2024-11-29 09:30:09.994191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.308 [2024-11-29 09:30:09.994244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ff36c6ff cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.308 [2024-11-29 09:30:09.994257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:47.308 #64 NEW cov: 11828 ft: 15371 corp: 38/835b lim: 35 exec/s: 64 rss: 70Mb L: 30/35 MS: 1 CrossOver- 00:06:47.308 [2024-11-29 09:30:10.033857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.308 [2024-11-29 09:30:10.033885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.308 [2024-11-29 09:30:10.033941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffcf0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.308 [2024-11-29 09:30:10.033956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.308 #65 NEW cov: 11828 ft: 15394 corp: 39/853b 
lim: 35 exec/s: 65 rss: 70Mb L: 18/35 MS: 1 InsertRepeatedBytes- 00:06:47.308 [2024-11-29 09:30:10.083913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:c6310a32 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.308 [2024-11-29 09:30:10.083950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.308 [2024-11-29 09:30:10.084019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c6c6c6c6 cdw11:c6c60003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.308 [2024-11-29 09:30:10.084038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.308 #66 NEW cov: 11828 ft: 15540 corp: 40/868b lim: 35 exec/s: 66 rss: 70Mb L: 15/35 MS: 1 CrossOver- 00:06:47.308 [2024-11-29 09:30:10.124377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:01ff32c6 cdw11:ffff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.308 [2024-11-29 09:30:10.124407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.308 [2024-11-29 09:30:10.124467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:c601c6c6 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.308 [2024-11-29 09:30:10.124482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:47.308 [2024-11-29 09:30:10.124540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:36360000 cdw11:36360000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.309 [2024-11-29 09:30:10.124557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:47.309 [2024-11-29 09:30:10.124615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:36ca3636 cdw11:c0390000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.309 [2024-11-29 09:30:10.124629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:47.309 #67 NEW cov: 11828 ft: 15543 corp: 41/902b lim: 35 exec/s: 67 rss: 70Mb L: 34/35 MS: 1 ChangeBinInt- 00:06:47.568 [2024-11-29 09:30:10.163995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:31aa32c6 cdw11:32aa0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:47.568 [2024-11-29 09:30:10.164022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:47.568 #68 NEW cov: 11828 ft: 15567 corp: 42/910b lim: 35 exec/s: 34 rss: 70Mb L: 8/35 MS: 1 CrossOver- 00:06:47.568 #68 DONE cov: 11828 ft: 15567 corp: 42/910b lim: 35 exec/s: 34 rss: 70Mb 00:06:47.568 ###### Recommended dictionary. ###### 00:06:47.568 "\001\000\000\000\000\000\000?" # Uses: 1 00:06:47.568 ###### End of recommended dictionary. 
###### 00:06:47.568 Done 68 runs in 2 second(s) 00:06:47.568 09:30:10 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:06:47.568 09:30:10 -- ../common.sh@72 -- # (( i++ )) 00:06:47.568 09:30:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:47.568 09:30:10 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:06:47.568 09:30:10 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:06:47.568 09:30:10 -- nvmf/run.sh@24 -- # local timen=1 00:06:47.568 09:30:10 -- nvmf/run.sh@25 -- # local core=0x1 00:06:47.568 09:30:10 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:06:47.568 09:30:10 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:06:47.568 09:30:10 -- nvmf/run.sh@29 -- # printf %02d 5 00:06:47.568 09:30:10 -- nvmf/run.sh@29 -- # port=4405 00:06:47.568 09:30:10 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:06:47.568 09:30:10 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:06:47.568 09:30:10 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:47.568 09:30:10 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:06:47.568 [2024-11-29 09:30:10.360327] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:47.568 [2024-11-29 09:30:10.360413] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3180947 ] 00:06:47.568 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.828 [2024-11-29 09:30:10.549225] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.828 [2024-11-29 09:30:10.614953] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:47.828 [2024-11-29 09:30:10.615085] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.087 [2024-11-29 09:30:10.673529] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:48.087 [2024-11-29 09:30:10.689910] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:06:48.087 INFO: Running with entropic power schedule (0xFF, 100). 00:06:48.087 INFO: Seed: 562614700 00:06:48.087 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:48.087 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:48.087 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:06:48.087 INFO: A corpus is not provided, starting from an empty corpus 00:06:48.087 #2 INITED exec/s: 0 rss: 61Mb 00:06:48.087 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
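The nvmf/run.sh trace above shows how each short-fuzz pass is parameterized before the libFuzzer banner appears: the fuzzer type (5 here) selects a unique TCP service port (44 plus the zero-padded type, hence 4405), sed rewrites the stock fuzz_json.conf so the target listens on that trsvcid, and llvm_nvme_fuzz is launched on core mask 0x1 with 512 MB of memory against the resulting trid, with -Z passing the fuzzer type, -D pointing at the per-type corpus directory, and -t 1 bounding the run time. Condensed into a standalone sketch of those same steps (paths shortened; $SPDK and $OUT stand in for the long workspace paths in the log):

  # Sketch of one fuzzer launch as traced above; N is the fuzzer type.
  N=5
  port="44$(printf '%02d' "$N")"                        # -> 4405
  mkdir -p "$SPDK/../corpus/llvm_nvmf_$N"
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$N.conf"
  "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
      -P "$OUT/llvm/" \
      -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
      -c "/tmp/fuzz_json_$N.conf" -t 1 -D "$SPDK/../corpus/llvm_nvmf_$N" \
      -Z "$N" -r "/var/tmp/spdk$N.sock"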
00:06:48.087 This may also happen if the target rejected all inputs we tried so far 00:06:48.087 [2024-11-29 09:30:10.735061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.087 [2024-11-29 09:30:10.735090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.347 NEW_FUNC[1/671]: 0x442af8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:06:48.347 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:48.347 #14 NEW cov: 11612 ft: 11613 corp: 2/10b lim: 45 exec/s: 0 rss: 68Mb L: 9/9 MS: 2 CopyPart-CMP- DE: "\001\000\000\000\000\000\000m"- 00:06:48.347 [2024-11-29 09:30:11.035785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cfc80093 cdw11:f4090003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.347 [2024-11-29 09:30:11.035818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.347 #23 NEW cov: 11725 ft: 12027 corp: 3/22b lim: 45 exec/s: 0 rss: 68Mb L: 12/12 MS: 4 InsertByte-InsertByte-InsertByte-CMP- DE: "\000\223\317\310\364\011q`"- 00:06:48.347 [2024-11-29 09:30:11.075747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.347 [2024-11-29 09:30:11.075773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.347 #24 NEW cov: 11731 ft: 12314 corp: 4/31b lim: 45 exec/s: 0 rss: 68Mb L: 9/12 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000m"- 00:06:48.347 [2024-11-29 09:30:11.105862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:001a0100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.347 [2024-11-29 09:30:11.105888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.347 #25 NEW cov: 11816 ft: 12513 corp: 5/40b lim: 45 exec/s: 0 rss: 68Mb L: 9/12 MS: 1 ChangeByte- 00:06:48.347 [2024-11-29 09:30:11.145998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000e201 cdw11:1a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.347 [2024-11-29 09:30:11.146023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.347 #30 NEW cov: 11816 ft: 12663 corp: 6/50b lim: 45 exec/s: 0 rss: 68Mb L: 10/12 MS: 5 InsertByte-ChangeByte-ShuffleBytes-EraseBytes-CrossOver- 00:06:48.347 [2024-11-29 09:30:11.176059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:001a0100 cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.347 [2024-11-29 09:30:11.176085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.606 #31 NEW cov: 11816 ft: 12715 corp: 7/60b lim: 45 exec/s: 0 rss: 68Mb L: 10/12 MS: 1 InsertByte- 00:06:48.606 [2024-11-29 09:30:11.216161] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff1efe cdw11:e5ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.606 [2024-11-29 09:30:11.216186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.606 #32 NEW cov: 11816 ft: 12774 corp: 8/70b lim: 45 exec/s: 0 rss: 69Mb L: 10/12 MS: 1 ChangeBinInt- 00:06:48.606 [2024-11-29 09:30:11.256275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:80000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.606 [2024-11-29 09:30:11.256300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.606 #33 NEW cov: 11816 ft: 12885 corp: 9/79b lim: 45 exec/s: 0 rss: 69Mb L: 9/12 MS: 1 ChangeBit- 00:06:48.606 [2024-11-29 09:30:11.296522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:93cf0a00 cdw11:c8f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.606 [2024-11-29 09:30:11.296547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.606 [2024-11-29 09:30:11.296605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffc8272d cdw11:f40a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.606 [2024-11-29 09:30:11.296618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.606 #35 NEW cov: 11816 ft: 13658 corp: 10/97b lim: 45 exec/s: 0 rss: 69Mb L: 18/18 MS: 2 CrossOver-CrossOver- 00:06:48.606 [2024-11-29 09:30:11.336511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cfc80093 cdw11:f4090003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.606 [2024-11-29 09:30:11.336536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.606 #36 NEW cov: 11816 ft: 13720 corp: 11/109b lim: 45 exec/s: 0 rss: 69Mb L: 12/18 MS: 1 ShuffleBytes- 00:06:48.606 [2024-11-29 09:30:11.376645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00750a01 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.606 [2024-11-29 09:30:11.376670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.606 #37 NEW cov: 11816 ft: 13744 corp: 12/122b lim: 45 exec/s: 0 rss: 69Mb L: 13/18 MS: 1 CMP- DE: "u\000\000\000"- 00:06:48.606 [2024-11-29 09:30:11.416907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:93cf0a00 cdw11:c8f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.606 [2024-11-29 09:30:11.416932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.606 [2024-11-29 09:30:11.416984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffc8272d cdw11:f40a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.606 [2024-11-29 09:30:11.416998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.606 #38 NEW cov: 11816 ft: 13758 corp: 13/143b lim: 45 exec/s: 0 
rss: 69Mb L: 21/21 MS: 1 CrossOver- 00:06:48.865 [2024-11-29 09:30:11.456866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cfc80093 cdw11:f4090003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.865 [2024-11-29 09:30:11.456891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.865 #39 NEW cov: 11816 ft: 13776 corp: 14/156b lim: 45 exec/s: 0 rss: 69Mb L: 13/21 MS: 1 InsertByte- 00:06:48.865 [2024-11-29 09:30:11.496989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00750a01 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.865 [2024-11-29 09:30:11.497014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.865 #40 NEW cov: 11816 ft: 13844 corp: 15/169b lim: 45 exec/s: 0 rss: 69Mb L: 13/21 MS: 1 ChangeBinInt- 00:06:48.865 [2024-11-29 09:30:11.537114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000e201 cdw11:66000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.865 [2024-11-29 09:30:11.537139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.865 #41 NEW cov: 11816 ft: 13864 corp: 16/179b lim: 45 exec/s: 0 rss: 69Mb L: 10/21 MS: 1 ChangeByte- 00:06:48.865 [2024-11-29 09:30:11.577391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:93cf0a00 cdw11:c8f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.865 [2024-11-29 09:30:11.577419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.865 [2024-11-29 09:30:11.577473] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffc8277e cdw11:f40a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.865 [2024-11-29 09:30:11.577486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.865 #42 NEW cov: 11816 ft: 13880 corp: 17/197b lim: 45 exec/s: 0 rss: 69Mb L: 18/21 MS: 1 ChangeByte- 00:06:48.865 [2024-11-29 09:30:11.617346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:b6000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.865 [2024-11-29 09:30:11.617371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.865 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:48.865 #43 NEW cov: 11839 ft: 13914 corp: 18/206b lim: 45 exec/s: 0 rss: 69Mb L: 9/21 MS: 1 ChangeByte- 00:06:48.865 [2024-11-29 09:30:11.657936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:93ff0a00 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.865 [2024-11-29 09:30:11.657963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:48.865 [2024-11-29 09:30:11.658015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.865 [2024-11-29 
09:30:11.658029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:48.865 [2024-11-29 09:30:11.658082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.865 [2024-11-29 09:30:11.658095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:48.865 [2024-11-29 09:30:11.658147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:cfc8ffff cdw11:f4090003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.865 [2024-11-29 09:30:11.658159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:48.865 #44 NEW cov: 11839 ft: 14293 corp: 19/250b lim: 45 exec/s: 0 rss: 69Mb L: 44/44 MS: 1 InsertRepeatedBytes- 00:06:48.865 [2024-11-29 09:30:11.697605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cfc80093 cdw11:f4090003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:48.865 [2024-11-29 09:30:11.697647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.124 #45 NEW cov: 11839 ft: 14354 corp: 20/262b lim: 45 exec/s: 0 rss: 69Mb L: 12/44 MS: 1 ShuffleBytes- 00:06:49.124 [2024-11-29 09:30:11.727677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff1efe cdw11:e5ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.124 [2024-11-29 09:30:11.727702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.124 #46 NEW cov: 11839 ft: 14422 corp: 21/272b lim: 45 exec/s: 46 rss: 69Mb L: 10/44 MS: 1 ChangeBinInt- 00:06:49.124 [2024-11-29 09:30:11.767927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:93cf0a00 cdw11:c8f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.124 [2024-11-29 09:30:11.767953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.124 [2024-11-29 09:30:11.768007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:fec8272d cdw11:f40a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.124 [2024-11-29 09:30:11.768026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.124 #47 NEW cov: 11839 ft: 14427 corp: 22/293b lim: 45 exec/s: 47 rss: 69Mb L: 21/44 MS: 1 ChangeBit- 00:06:49.124 [2024-11-29 09:30:11.808058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:93cf0a00 cdw11:c8f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.124 [2024-11-29 09:30:11.808083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.124 [2024-11-29 09:30:11.808136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00007500 cdw11:f40a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.124 [2024-11-29 09:30:11.808151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:06:49.124 #48 NEW cov: 11839 ft: 14472 corp: 23/314b lim: 45 exec/s: 48 rss: 69Mb L: 21/44 MS: 1 PersAutoDict- DE: "u\000\000\000"- 00:06:49.124 [2024-11-29 09:30:11.848501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a5f cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.124 [2024-11-29 09:30:11.848527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.124 [2024-11-29 09:30:11.848580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.124 [2024-11-29 09:30:11.848594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.124 [2024-11-29 09:30:11.848651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.124 [2024-11-29 09:30:11.848665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.124 [2024-11-29 09:30:11.848716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.124 [2024-11-29 09:30:11.848729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:49.124 #53 NEW cov: 11839 ft: 14483 corp: 24/356b lim: 45 exec/s: 53 rss: 69Mb L: 42/44 MS: 5 InsertByte-InsertByte-ShuffleBytes-EraseBytes-InsertRepeatedBytes- 00:06:49.124 [2024-11-29 09:30:11.888607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:93ff0a00 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.124 [2024-11-29 09:30:11.888632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.124 [2024-11-29 09:30:11.888684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.124 [2024-11-29 09:30:11.888698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.124 [2024-11-29 09:30:11.888748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.124 [2024-11-29 09:30:11.888762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:49.124 [2024-11-29 09:30:11.888813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:272d7160 cdw11:ffc80007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.124 [2024-11-29 09:30:11.888825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:49.125 #54 NEW cov: 11839 ft: 14548 corp: 25/394b lim: 45 exec/s: 54 rss: 69Mb L: 38/44 MS: 1 EraseBytes- 00:06:49.125 [2024-11-29 09:30:11.938279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:93cf0a00 cdw11:c8f40000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.125 [2024-11-29 09:30:11.938305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.125 #55 NEW cov: 11839 ft: 14616 corp: 26/411b lim: 45 exec/s: 55 rss: 70Mb L: 17/44 MS: 1 EraseBytes- 00:06:49.384 [2024-11-29 09:30:11.978389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d4c80093 cdw11:f4090003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.384 [2024-11-29 09:30:11.978414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.384 #56 NEW cov: 11839 ft: 14623 corp: 27/423b lim: 45 exec/s: 56 rss: 70Mb L: 12/44 MS: 1 ChangeBinInt- 00:06:49.384 [2024-11-29 09:30:12.018693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cfc80093 cdw11:f4090003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.384 [2024-11-29 09:30:12.018718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.384 [2024-11-29 09:30:12.018772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:60270971 cdw11:7eff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.384 [2024-11-29 09:30:12.018786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.384 #57 NEW cov: 11839 ft: 14649 corp: 28/442b lim: 45 exec/s: 57 rss: 70Mb L: 19/44 MS: 1 CrossOver- 00:06:49.384 [2024-11-29 09:30:12.058633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75000093 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.384 [2024-11-29 09:30:12.058659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.384 #58 NEW cov: 11839 ft: 14655 corp: 29/455b lim: 45 exec/s: 58 rss: 70Mb L: 13/44 MS: 1 PersAutoDict- DE: "u\000\000\000"- 00:06:49.384 [2024-11-29 09:30:12.098923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0093f00a cdw11:cfc80007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.384 [2024-11-29 09:30:12.098949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.384 [2024-11-29 09:30:12.098996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00006075 cdw11:00f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.384 [2024-11-29 09:30:12.099010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.384 #59 NEW cov: 11839 ft: 14671 corp: 30/477b lim: 45 exec/s: 59 rss: 70Mb L: 22/44 MS: 1 InsertByte- 00:06:49.384 [2024-11-29 09:30:12.138840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:80000100 cdw11:09000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.384 [2024-11-29 09:30:12.138867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.384 #60 NEW cov: 11839 ft: 14674 corp: 31/486b lim: 45 exec/s: 60 rss: 70Mb L: 9/44 MS: 1 CMP- DE: "\011\000"- 00:06:49.384 [2024-11-29 09:30:12.178957] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cf000093 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.384 [2024-11-29 09:30:12.178983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.384 #61 NEW cov: 11839 ft: 14679 corp: 32/498b lim: 45 exec/s: 61 rss: 70Mb L: 12/44 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\006"- 00:06:49.384 [2024-11-29 09:30:12.209054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:75000093 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.384 [2024-11-29 09:30:12.209082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.643 #62 NEW cov: 11839 ft: 14690 corp: 33/511b lim: 45 exec/s: 62 rss: 70Mb L: 13/44 MS: 1 ChangeBit- 00:06:49.643 [2024-11-29 09:30:12.249206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cfc80083 cdw11:f4090003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.643 [2024-11-29 09:30:12.249232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.643 #63 NEW cov: 11839 ft: 14706 corp: 34/523b lim: 45 exec/s: 63 rss: 70Mb L: 12/44 MS: 1 ChangeBit- 00:06:49.643 [2024-11-29 09:30:12.289287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cfc80093 cdw11:f4090003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.643 [2024-11-29 09:30:12.289312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.643 #64 NEW cov: 11839 ft: 14714 corp: 35/536b lim: 45 exec/s: 64 rss: 70Mb L: 13/44 MS: 1 PersAutoDict- DE: "\000\223\317\310\364\011q`"- 00:06:49.643 [2024-11-29 09:30:12.319376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a01 cdw11:0a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.643 [2024-11-29 09:30:12.319401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.643 #66 NEW cov: 11839 ft: 14716 corp: 36/548b lim: 45 exec/s: 66 rss: 70Mb L: 12/44 MS: 2 EraseBytes-CrossOver- 00:06:49.643 [2024-11-29 09:30:12.359507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000109 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.643 [2024-11-29 09:30:12.359532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.643 #67 NEW cov: 11839 ft: 14726 corp: 37/557b lim: 45 exec/s: 67 rss: 70Mb L: 9/44 MS: 1 ChangeBinInt- 00:06:49.643 [2024-11-29 09:30:12.399689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:09710af4 cdw11:60750000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.643 [2024-11-29 09:30:12.399713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.643 #68 NEW cov: 11839 ft: 14736 corp: 38/574b lim: 45 exec/s: 68 rss: 70Mb L: 17/44 MS: 1 EraseBytes- 00:06:49.643 [2024-11-29 09:30:12.439921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:93cf0a00 
cdw11:c8f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.643 [2024-11-29 09:30:12.439945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.643 [2024-11-29 09:30:12.440015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffc82b7e cdw11:f40a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.643 [2024-11-29 09:30:12.440029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.643 #69 NEW cov: 11839 ft: 14746 corp: 39/592b lim: 45 exec/s: 69 rss: 70Mb L: 18/44 MS: 1 ChangeByte- 00:06:49.643 [2024-11-29 09:30:12.479852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cfc80093 cdw11:f4090003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.643 [2024-11-29 09:30:12.479876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.903 #70 NEW cov: 11839 ft: 14750 corp: 40/601b lim: 45 exec/s: 70 rss: 70Mb L: 9/44 MS: 1 EraseBytes- 00:06:49.903 [2024-11-29 09:30:12.510092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:93cf0a00 cdw11:c8f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.903 [2024-11-29 09:30:12.510116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.903 [2024-11-29 09:30:12.510199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:fec8272d cdw11:15000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.903 [2024-11-29 09:30:12.510212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.903 #71 NEW cov: 11839 ft: 14757 corp: 41/622b lim: 45 exec/s: 71 rss: 70Mb L: 21/44 MS: 1 ChangeBinInt- 00:06:49.903 [2024-11-29 09:30:12.550256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:93cf0a00 cdw11:c8f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.903 [2024-11-29 09:30:12.550280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.903 [2024-11-29 09:30:12.550333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffc8277e cdw11:f40a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.903 [2024-11-29 09:30:12.550346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.903 #72 NEW cov: 11839 ft: 14787 corp: 42/643b lim: 45 exec/s: 72 rss: 70Mb L: 21/44 MS: 1 InsertRepeatedBytes- 00:06:49.903 [2024-11-29 09:30:12.590338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cfc80093 cdw11:f4090003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.903 [2024-11-29 09:30:12.590363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.903 [2024-11-29 09:30:12.590416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0ac82dff cdw11:f40a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.903 [2024-11-29 09:30:12.590430] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:49.903 #73 NEW cov: 11839 ft: 14820 corp: 43/661b lim: 45 exec/s: 73 rss: 70Mb L: 18/44 MS: 1 CrossOver- 00:06:49.903 [2024-11-29 09:30:12.630302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cfc80093 cdw11:f4090003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.903 [2024-11-29 09:30:12.630326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.903 #74 NEW cov: 11839 ft: 14860 corp: 44/674b lim: 45 exec/s: 74 rss: 70Mb L: 13/44 MS: 1 InsertByte- 00:06:49.903 [2024-11-29 09:30:12.660402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cfc800f1 cdw11:f4090003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.903 [2024-11-29 09:30:12.660426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.903 #75 NEW cov: 11839 ft: 14863 corp: 45/686b lim: 45 exec/s: 75 rss: 70Mb L: 12/44 MS: 1 ChangeByte- 00:06:49.903 [2024-11-29 09:30:12.690495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:cf010083 cdw11:c8f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.903 [2024-11-29 09:30:12.690518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:49.903 #76 NEW cov: 11839 ft: 14869 corp: 46/702b lim: 45 exec/s: 76 rss: 70Mb L: 16/44 MS: 1 CrossOver- 00:06:49.903 [2024-11-29 09:30:12.730634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:93cf0a00 cdw11:c8f40000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:49.903 [2024-11-29 09:30:12.730659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.163 #77 NEW cov: 11839 ft: 14876 corp: 47/717b lim: 45 exec/s: 38 rss: 70Mb L: 15/44 MS: 1 EraseBytes- 00:06:50.163 #77 DONE cov: 11839 ft: 14876 corp: 47/717b lim: 45 exec/s: 38 rss: 70Mb 00:06:50.163 ###### Recommended dictionary. ###### 00:06:50.163 "\001\000\000\000\000\000\000m" # Uses: 1 00:06:50.163 "\000\223\317\310\364\011q`" # Uses: 1 00:06:50.163 "u\000\000\000" # Uses: 2 00:06:50.163 "\011\000" # Uses: 0 00:06:50.163 "\000\000\000\000\000\000\000\006" # Uses: 0 00:06:50.163 ###### End of recommended dictionary. 
###### 00:06:50.163 Done 77 runs in 2 second(s) 00:06:50.163 09:30:12 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:06:50.163 09:30:12 -- ../common.sh@72 -- # (( i++ )) 00:06:50.163 09:30:12 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:50.163 09:30:12 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:06:50.163 09:30:12 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:06:50.163 09:30:12 -- nvmf/run.sh@24 -- # local timen=1 00:06:50.163 09:30:12 -- nvmf/run.sh@25 -- # local core=0x1 00:06:50.163 09:30:12 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:06:50.163 09:30:12 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:06:50.163 09:30:12 -- nvmf/run.sh@29 -- # printf %02d 6 00:06:50.163 09:30:12 -- nvmf/run.sh@29 -- # port=4406 00:06:50.163 09:30:12 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:06:50.163 09:30:12 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:06:50.163 09:30:12 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:50.163 09:30:12 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:06:50.163 [2024-11-29 09:30:12.914175] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:50.163 [2024-11-29 09:30:12.914244] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3181368 ] 00:06:50.163 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.422 [2024-11-29 09:30:13.093156] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.422 [2024-11-29 09:30:13.155897] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:50.422 [2024-11-29 09:30:13.156026] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.422 [2024-11-29 09:30:13.213928] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:50.422 [2024-11-29 09:30:13.230320] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:06:50.422 INFO: Running with entropic power schedule (0xFF, 100). 00:06:50.422 INFO: Seed: 3101604225 00:06:50.681 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:50.681 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:50.681 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:06:50.681 INFO: A corpus is not provided, starting from an empty corpus 00:06:50.681 #2 INITED exec/s: 0 rss: 60Mb 00:06:50.681 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
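The per-run footer repeats across every pass in this log: a line such as "#77 DONE cov: 11839 ft: 14876 corp: 47/717b lim: 45 exec/s: 38 rss: 70Mb" is standard libFuzzer accounting — cov is the number of coverage points (edges) hit, ft the number of features, corp the corpus size as entries/total bytes, lim the current input-length cap, followed by the execution rate and resident memory — and the "Recommended dictionary" block lists byte sequences, with use counts, that the fuzzer found productive. When skimming a saved copy of the console output, the closing summary of each run can be pulled out with a one-liner along these lines (build.log is a placeholder for wherever the log was saved):

  # Print each run's closing "#N DONE" line from a saved copy of this log.
  grep -oE '#[0-9]+ DONE cov: [0-9]+ ft: [0-9]+ corp: [0-9]+/[0-9]+b lim: [0-9]+ exec/s: [0-9]+ rss: [0-9]+Mb' build.log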
00:06:50.681 This may also happen if the target rejected all inputs we tried so far 00:06:50.681 [2024-11-29 09:30:13.299663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001c0a cdw11:00000000 00:06:50.681 [2024-11-29 09:30:13.299699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.941 NEW_FUNC[1/668]: 0x445308 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:06:50.941 NEW_FUNC[2/668]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:50.941 #4 NEW cov: 11527 ft: 11519 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 2 ChangeByte-CrossOver- 00:06:50.941 [2024-11-29 09:30:13.631230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.631283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.941 [2024-11-29 09:30:13.631416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.631442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.941 [2024-11-29 09:30:13.631567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.631592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:50.941 [2024-11-29 09:30:13.631718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.631737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:50.941 NEW_FUNC[1/1]: 0x15097e8 in nvme_qpair_get_state /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1456 00:06:50.941 #8 NEW cov: 11642 ft: 12330 corp: 3/12b lim: 10 exec/s: 0 rss: 68Mb L: 9/9 MS: 4 EraseBytes-ChangeBinInt-ChangeBit-InsertRepeatedBytes- 00:06:50.941 [2024-11-29 09:30:13.681371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.681403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.941 [2024-11-29 09:30:13.681521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.681543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.941 [2024-11-29 09:30:13.681666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.681685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:50.941 [2024-11-29 09:30:13.681792] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.681810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:50.941 [2024-11-29 09:30:13.681931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.681947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:50.941 #9 NEW cov: 11648 ft: 12803 corp: 4/22b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CopyPart- 00:06:50.941 [2024-11-29 09:30:13.721489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.721516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.941 [2024-11-29 09:30:13.721622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.721637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.941 [2024-11-29 09:30:13.721759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.721777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:50.941 [2024-11-29 09:30:13.721884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006ac9 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.721899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:50.941 [2024-11-29 09:30:13.721999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.722016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:50.941 #10 NEW cov: 11733 ft: 13093 corp: 5/32b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ChangeByte- 00:06:50.941 [2024-11-29 09:30:13.771603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.771630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:50.941 [2024-11-29 09:30:13.771739] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.771754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:50.941 [2024-11-29 09:30:13.771866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9e7 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.771883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:50.941 [2024-11-29 09:30:13.771997] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006ac9 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.772013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:50.941 [2024-11-29 09:30:13.772119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:50.941 [2024-11-29 09:30:13.772135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.201 #11 NEW cov: 11733 ft: 13214 corp: 6/42b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ChangeByte- 00:06:51.201 [2024-11-29 09:30:13.811355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001c0a cdw11:00000000 00:06:51.201 [2024-11-29 09:30:13.811382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.201 [2024-11-29 09:30:13.811493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.201 [2024-11-29 09:30:13.811509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.201 [2024-11-29 09:30:13.811622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e76a cdw11:00000000 00:06:51.201 [2024-11-29 09:30:13.811646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.201 #12 NEW cov: 11733 ft: 13411 corp: 7/48b lim: 10 exec/s: 0 rss: 68Mb L: 6/10 MS: 1 CrossOver- 00:06:51.201 [2024-11-29 09:30:13.851814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.201 [2024-11-29 09:30:13.851841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.201 [2024-11-29 09:30:13.851952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.201 [2024-11-29 09:30:13.851971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.201 [2024-11-29 09:30:13.852046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.201 [2024-11-29 09:30:13.852061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:13.852168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.852186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:13.852297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.852313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.202 #13 NEW cov: 11733 ft: 13488 corp: 
8/58b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CopyPart- 00:06:51.202 [2024-11-29 09:30:13.891839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.891866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:13.891984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.892000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:13.892109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c949 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.892126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:13.892213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.892230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:13.892334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.892350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.202 #14 NEW cov: 11733 ft: 13523 corp: 9/68b lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 ChangeBit- 00:06:51.202 [2024-11-29 09:30:13.932099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.932124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:13.932241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.932257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:13.932367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c949 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.932382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:13.932449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.932466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:13.932576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.932592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.202 #15 NEW cov: 11733 ft: 13665 corp: 10/78b 
lim: 10 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 CopyPart- 00:06:51.202 [2024-11-29 09:30:13.972288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cbc9 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.972316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:13.972430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.972446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:13.972552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.972569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:13.972658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006ac9 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.972674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:13.972731] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:13.972748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.202 #16 NEW cov: 11733 ft: 13685 corp: 11/88b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 ChangeBit- 00:06:51.202 [2024-11-29 09:30:14.012336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:14.012362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:14.012471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:14.012490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:14.012602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:14.012616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:14.012721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000000a cdw11:00000000 00:06:51.202 [2024-11-29 09:30:14.012737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.202 [2024-11-29 09:30:14.012843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:51.202 [2024-11-29 09:30:14.012860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.202 #17 NEW cov: 11733 ft: 13752 corp: 12/98b lim: 10 
exec/s: 0 rss: 69Mb L: 10/10 MS: 1 ChangeBinInt- 00:06:51.462 [2024-11-29 09:30:14.052164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001c0a cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.052191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.462 [2024-11-29 09:30:14.052304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.052321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.462 [2024-11-29 09:30:14.052430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c76a cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.052447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.462 #18 NEW cov: 11733 ft: 13771 corp: 13/104b lim: 10 exec/s: 0 rss: 69Mb L: 6/10 MS: 1 ChangeBit- 00:06:51.462 [2024-11-29 09:30:14.092607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.092633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.462 [2024-11-29 09:30:14.092742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.092759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.462 [2024-11-29 09:30:14.092868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9d3 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.092883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.462 [2024-11-29 09:30:14.092994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.093010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.462 [2024-11-29 09:30:14.093116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.093133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.462 #19 NEW cov: 11733 ft: 13798 corp: 14/114b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:06:51.462 [2024-11-29 09:30:14.132810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cbc9 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.132835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.462 [2024-11-29 09:30:14.132948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.132965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.462 [2024-11-29 09:30:14.133074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.133090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.462 [2024-11-29 09:30:14.133206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006ac9 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.133224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.462 [2024-11-29 09:30:14.133334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00003fe5 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.133350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.462 #20 NEW cov: 11733 ft: 13867 corp: 15/124b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:06:51.462 [2024-11-29 09:30:14.172134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000020a cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.172159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.462 #21 NEW cov: 11733 ft: 13950 corp: 16/126b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ChangeBinInt- 00:06:51.462 [2024-11-29 09:30:14.212227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000280a cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.212255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.462 #22 NEW cov: 11733 ft: 13969 corp: 17/128b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ChangeByte- 00:06:51.462 [2024-11-29 09:30:14.253228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cbd9 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.253254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.462 [2024-11-29 09:30:14.253373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.253389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.462 [2024-11-29 09:30:14.253499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.253516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.462 [2024-11-29 09:30:14.253636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006ac9 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.253653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.462 [2024-11-29 09:30:14.253764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 
00:06:51.462 [2024-11-29 09:30:14.253781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.462 #23 NEW cov: 11733 ft: 13992 corp: 18/138b lim: 10 exec/s: 23 rss: 69Mb L: 10/10 MS: 1 ChangeBit- 00:06:51.462 [2024-11-29 09:30:14.292850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cbc9 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.292877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.462 [2024-11-29 09:30:14.292993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006ac9 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.293009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.462 [2024-11-29 09:30:14.293119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:51.462 [2024-11-29 09:30:14.293136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.722 #24 NEW cov: 11733 ft: 14012 corp: 19/144b lim: 10 exec/s: 24 rss: 69Mb L: 6/10 MS: 1 EraseBytes- 00:06:51.722 [2024-11-29 09:30:14.333383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cbc9 cdw11:00000000 00:06:51.722 [2024-11-29 09:30:14.333409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.722 [2024-11-29 09:30:14.333523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.722 [2024-11-29 09:30:14.333538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.722 [2024-11-29 09:30:14.333648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.722 [2024-11-29 09:30:14.333666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.722 [2024-11-29 09:30:14.333772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:51.722 [2024-11-29 09:30:14.333788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.722 [2024-11-29 09:30:14.333895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:51.722 [2024-11-29 09:30:14.333913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.722 #25 NEW cov: 11733 ft: 14032 corp: 20/154b lim: 10 exec/s: 25 rss: 69Mb L: 10/10 MS: 1 CMP- DE: "\377\377\377\377"- 00:06:51.722 [2024-11-29 09:30:14.373416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.722 [2024-11-29 09:30:14.373442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.722 [2024-11-29 09:30:14.373554] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.722 [2024-11-29 09:30:14.373569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.722 [2024-11-29 09:30:14.373684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.722 [2024-11-29 09:30:14.373700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.722 [2024-11-29 09:30:14.373802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.722 [2024-11-29 09:30:14.373817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.722 [2024-11-29 09:30:14.373925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:51.722 [2024-11-29 09:30:14.373942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.722 #26 NEW cov: 11733 ft: 14042 corp: 21/164b lim: 10 exec/s: 26 rss: 69Mb L: 10/10 MS: 1 ShuffleBytes- 00:06:51.722 [2024-11-29 09:30:14.413395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.722 [2024-11-29 09:30:14.413420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.722 [2024-11-29 09:30:14.413531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.722 [2024-11-29 09:30:14.413547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.722 [2024-11-29 09:30:14.413661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000049c9 cdw11:00000000 00:06:51.722 [2024-11-29 09:30:14.413678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.722 [2024-11-29 09:30:14.413792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.723 [2024-11-29 09:30:14.413807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.723 #27 NEW cov: 11733 ft: 14043 corp: 22/173b lim: 10 exec/s: 27 rss: 69Mb L: 9/10 MS: 1 EraseBytes- 00:06:51.723 [2024-11-29 09:30:14.453428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.723 [2024-11-29 09:30:14.453453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.723 [2024-11-29 09:30:14.453568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.723 [2024-11-29 09:30:14.453594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.723 [2024-11-29 09:30:14.453709] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e76a cdw11:00000000 00:06:51.723 [2024-11-29 09:30:14.453726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.723 [2024-11-29 09:30:14.453855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.723 [2024-11-29 09:30:14.453871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.723 #28 NEW cov: 11733 ft: 14069 corp: 23/182b lim: 10 exec/s: 28 rss: 69Mb L: 9/10 MS: 1 EraseBytes- 00:06:51.723 [2024-11-29 09:30:14.493609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.723 [2024-11-29 09:30:14.493634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.723 [2024-11-29 09:30:14.493750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.723 [2024-11-29 09:30:14.493766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.723 [2024-11-29 09:30:14.493873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.723 [2024-11-29 09:30:14.493890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.723 [2024-11-29 09:30:14.494001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.723 [2024-11-29 09:30:14.494016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.723 #29 NEW cov: 11733 ft: 14103 corp: 24/191b lim: 10 exec/s: 29 rss: 69Mb L: 9/10 MS: 1 CopyPart- 00:06:51.723 [2024-11-29 09:30:14.533955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.723 [2024-11-29 09:30:14.533981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.723 [2024-11-29 09:30:14.534093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.723 [2024-11-29 09:30:14.534109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.723 [2024-11-29 09:30:14.534217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9e7 cdw11:00000000 00:06:51.723 [2024-11-29 09:30:14.534233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.723 [2024-11-29 09:30:14.534339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000cbc9 cdw11:00000000 00:06:51.723 [2024-11-29 09:30:14.534355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.723 [2024-11-29 09:30:14.534463] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00006ae5 cdw11:00000000 00:06:51.723 [2024-11-29 09:30:14.534481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.723 #30 NEW cov: 11733 ft: 14113 corp: 25/201b lim: 10 exec/s: 30 rss: 69Mb L: 10/10 MS: 1 CrossOver- 00:06:51.983 [2024-11-29 09:30:14.574037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.574063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.574180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.574196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.574308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9e7 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.574323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.574432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006ac9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.574448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.574552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c94d cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.574569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.983 #31 NEW cov: 11733 ft: 14121 corp: 26/211b lim: 10 exec/s: 31 rss: 69Mb L: 10/10 MS: 1 ChangeByte- 00:06:51.983 [2024-11-29 09:30:14.614111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.614139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.614259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.614276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.614383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000a49 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.614400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.614515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.614531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.614629] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.614646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:51.983 #32 NEW cov: 11733 ft: 14150 corp: 27/221b lim: 10 exec/s: 32 rss: 69Mb L: 10/10 MS: 1 ChangeBinInt- 00:06:51.983 [2024-11-29 09:30:14.654218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cbc9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.654245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.654355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.654372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.654485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.654501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.654605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006ac9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.654620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.983 #33 NEW cov: 11733 ft: 14160 corp: 28/229b lim: 10 exec/s: 33 rss: 69Mb L: 8/10 MS: 1 CrossOver- 00:06:51.983 [2024-11-29 09:30:14.704359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.704390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.704504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.704521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.704640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.704657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.704768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.704784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.983 #34 NEW cov: 11733 ft: 14171 corp: 29/238b lim: 10 exec/s: 34 rss: 69Mb L: 9/10 MS: 1 ShuffleBytes- 00:06:51.983 [2024-11-29 09:30:14.754481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.754508] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.754628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.754647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.754755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.754773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:51.983 [2024-11-29 09:30:14.754852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.754868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:51.983 #35 NEW cov: 11733 ft: 14175 corp: 30/246b lim: 10 exec/s: 35 rss: 69Mb L: 8/10 MS: 1 EraseBytes- 00:06:51.983 [2024-11-29 09:30:14.793969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00001e0a cdw11:00000000 00:06:51.983 [2024-11-29 09:30:14.793996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:51.984 #36 NEW cov: 11733 ft: 14212 corp: 31/248b lim: 10 exec/s: 36 rss: 69Mb L: 2/10 MS: 1 ChangeBit- 00:06:52.244 [2024-11-29 09:30:14.834575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.834609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:14.834718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006ac9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.834737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:14.834847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.834863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.244 #37 NEW cov: 11733 ft: 14221 corp: 32/254b lim: 10 exec/s: 37 rss: 69Mb L: 6/10 MS: 1 EraseBytes- 00:06:52.244 [2024-11-29 09:30:14.875040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.875071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:14.875183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c901 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.875199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:14.875314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 
cdw10:0000c9c9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.875330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:14.875435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.875452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:14.875556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.875571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:52.244 #38 NEW cov: 11733 ft: 14227 corp: 33/264b lim: 10 exec/s: 38 rss: 69Mb L: 10/10 MS: 1 InsertByte- 00:06:52.244 [2024-11-29 09:30:14.925298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.925327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:14.925430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.925446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:14.925549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.925565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:14.925673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.925690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:14.925807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.925826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:52.244 #39 NEW cov: 11733 ft: 14273 corp: 34/274b lim: 10 exec/s: 39 rss: 70Mb L: 10/10 MS: 1 ShuffleBytes- 00:06:52.244 [2024-11-29 09:30:14.965310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.965336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:14.965443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.965459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:14.965576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 
nsid:0 cdw10:00000a00 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.965593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:14.965705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.965722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:14.965828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:14.965843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:52.244 #40 NEW cov: 11733 ft: 14276 corp: 35/284b lim: 10 exec/s: 40 rss: 70Mb L: 10/10 MS: 1 ChangeBinInt- 00:06:52.244 [2024-11-29 09:30:15.005253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cbc9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:15.005278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:15.005387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:15.005403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:15.005513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:15.005529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:15.005618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00006ac9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:15.005633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:52.244 #41 NEW cov: 11733 ft: 14288 corp: 36/292b lim: 10 exec/s: 41 rss: 70Mb L: 8/10 MS: 1 ShuffleBytes- 00:06:52.244 [2024-11-29 09:30:15.045357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000cbc9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:15.045384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:15.045486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:15.045503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:15.045613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.244 [2024-11-29 09:30:15.045630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.244 [2024-11-29 09:30:15.045730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 
cid:7 nsid:0 cdw10:00000e00 cdw11:00000000 00:06:52.245 [2024-11-29 09:30:15.045748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:52.245 #42 NEW cov: 11733 ft: 14314 corp: 37/300b lim: 10 exec/s: 42 rss: 70Mb L: 8/10 MS: 1 CMP- DE: "\016\000"- 00:06:52.245 [2024-11-29 09:30:15.085647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.245 [2024-11-29 09:30:15.085674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.245 [2024-11-29 09:30:15.085795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9ff cdw11:00000000 00:06:52.245 [2024-11-29 09:30:15.085814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.245 [2024-11-29 09:30:15.085922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:52.245 [2024-11-29 09:30:15.085942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.245 [2024-11-29 09:30:15.086053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffc9 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.086071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:52.504 [2024-11-29 09:30:15.086182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.086199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:52.504 #43 NEW cov: 11733 ft: 14322 corp: 38/310b lim: 10 exec/s: 43 rss: 70Mb L: 10/10 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:06:52.504 [2024-11-29 09:30:15.125400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.125428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.504 [2024-11-29 09:30:15.125539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.125558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.504 [2024-11-29 09:30:15.125669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.125685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.504 #44 NEW cov: 11733 ft: 14358 corp: 39/317b lim: 10 exec/s: 44 rss: 70Mb L: 7/10 MS: 1 EraseBytes- 00:06:52.504 [2024-11-29 09:30:15.165921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.165947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:06:52.504 [2024-11-29 09:30:15.166015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.166032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.504 [2024-11-29 09:30:15.166145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.166160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.504 [2024-11-29 09:30:15.166264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.166282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:52.504 [2024-11-29 09:30:15.166395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.166413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:52.504 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:52.504 #45 NEW cov: 11756 ft: 14416 corp: 40/327b lim: 10 exec/s: 45 rss: 70Mb L: 10/10 MS: 1 ChangeBinInt- 00:06:52.504 [2024-11-29 09:30:15.206004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.206030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.504 [2024-11-29 09:30:15.206138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.206151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.504 [2024-11-29 09:30:15.206256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.206271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.504 [2024-11-29 09:30:15.206380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.206397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:52.504 [2024-11-29 09:30:15.206511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.206526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:52.504 #46 NEW cov: 11756 ft: 14438 corp: 41/337b lim: 10 exec/s: 46 rss: 70Mb L: 10/10 MS: 1 CopyPart- 00:06:52.504 [2024-11-29 09:30:15.246150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.504 
[2024-11-29 09:30:15.246176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:52.504 [2024-11-29 09:30:15.246287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.246303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:52.504 [2024-11-29 09:30:15.246425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000c9c9 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.246440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:52.504 [2024-11-29 09:30:15.246552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000a00 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.246568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:52.504 [2024-11-29 09:30:15.246685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000c9e5 cdw11:00000000 00:06:52.504 [2024-11-29 09:30:15.246702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:52.504 #47 NEW cov: 11756 ft: 14451 corp: 42/347b lim: 10 exec/s: 23 rss: 70Mb L: 10/10 MS: 1 ShuffleBytes- 00:06:52.504 #47 DONE cov: 11756 ft: 14451 corp: 42/347b lim: 10 exec/s: 23 rss: 70Mb 00:06:52.504 ###### Recommended dictionary. ###### 00:06:52.504 "\377\377\377\377" # Uses: 1 00:06:52.504 "\016\000" # Uses: 0 00:06:52.504 ###### End of recommended dictionary. 
###### 00:06:52.504 Done 47 runs in 2 second(s) 00:06:52.763 09:30:15 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:06:52.763 09:30:15 -- ../common.sh@72 -- # (( i++ )) 00:06:52.763 09:30:15 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:52.763 09:30:15 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:06:52.763 09:30:15 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:06:52.763 09:30:15 -- nvmf/run.sh@24 -- # local timen=1 00:06:52.763 09:30:15 -- nvmf/run.sh@25 -- # local core=0x1 00:06:52.763 09:30:15 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:06:52.763 09:30:15 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:06:52.763 09:30:15 -- nvmf/run.sh@29 -- # printf %02d 7 00:06:52.763 09:30:15 -- nvmf/run.sh@29 -- # port=4407 00:06:52.763 09:30:15 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:06:52.763 09:30:15 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:06:52.763 09:30:15 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:52.763 09:30:15 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:06:52.763 [2024-11-29 09:30:15.441644] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:52.763 [2024-11-29 09:30:15.441712] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3181913 ] 00:06:52.763 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.022 [2024-11-29 09:30:15.617948] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.022 [2024-11-29 09:30:15.681653] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:53.022 [2024-11-29 09:30:15.681796] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.022 [2024-11-29 09:30:15.739800] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:53.022 [2024-11-29 09:30:15.756124] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:06:53.022 INFO: Running with entropic power schedule (0xFF, 100). 00:06:53.022 INFO: Seed: 1334636595 00:06:53.022 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:53.022 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:53.022 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:06:53.022 INFO: A corpus is not provided, starting from an empty corpus 00:06:53.022 #2 INITED exec/s: 0 rss: 60Mb 00:06:53.022 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:06:53.022 This may also happen if the target rejected all inputs we tried so far 00:06:53.022 [2024-11-29 09:30:15.811396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007b0a cdw11:00000000 00:06:53.022 [2024-11-29 09:30:15.811424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.280 NEW_FUNC[1/669]: 0x445d08 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:06:53.280 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:06:53.280 #3 NEW cov: 11529 ft: 11528 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:06:53.280 [2024-11-29 09:30:16.112052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007b0a cdw11:00000000 00:06:53.280 [2024-11-29 09:30:16.112083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.540 #4 NEW cov: 11642 ft: 11989 corp: 3/5b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ShuffleBytes- 00:06:53.540 [2024-11-29 09:30:16.152353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008484 cdw11:00000000 00:06:53.540 [2024-11-29 09:30:16.152379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.540 [2024-11-29 09:30:16.152432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008484 cdw11:00000000 00:06:53.540 [2024-11-29 09:30:16.152445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.540 [2024-11-29 09:30:16.152494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008484 cdw11:00000000 00:06:53.540 [2024-11-29 09:30:16.152510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:53.540 #5 NEW cov: 11648 ft: 12519 corp: 4/12b lim: 10 exec/s: 0 rss: 68Mb L: 7/7 MS: 1 InsertRepeatedBytes- 00:06:53.540 [2024-11-29 09:30:16.192204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a31 cdw11:00000000 00:06:53.540 [2024-11-29 09:30:16.192230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.540 #6 NEW cov: 11733 ft: 12811 corp: 5/14b lim: 10 exec/s: 0 rss: 68Mb L: 2/7 MS: 1 InsertByte- 00:06:53.540 [2024-11-29 09:30:16.232312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000bb7b cdw11:00000000 00:06:53.540 [2024-11-29 09:30:16.232338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.540 #7 NEW cov: 11733 ft: 12931 corp: 6/17b lim: 10 exec/s: 0 rss: 68Mb L: 3/7 MS: 1 InsertByte- 00:06:53.541 [2024-11-29 09:30:16.272825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004949 cdw11:00000000 00:06:53.541 [2024-11-29 09:30:16.272851] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.541 [2024-11-29 09:30:16.272902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00004949 cdw11:00000000 00:06:53.541 [2024-11-29 09:30:16.272916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.541 [2024-11-29 09:30:16.272967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004949 cdw11:00000000 00:06:53.541 [2024-11-29 09:30:16.272980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:53.541 [2024-11-29 09:30:16.273031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000490a cdw11:00000000 00:06:53.541 [2024-11-29 09:30:16.273045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:53.541 #8 NEW cov: 11733 ft: 13243 corp: 7/25b lim: 10 exec/s: 0 rss: 68Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:06:53.541 [2024-11-29 09:30:16.312657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00007b0a cdw11:00000000 00:06:53.541 [2024-11-29 09:30:16.312682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.541 [2024-11-29 09:30:16.312733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007b0a cdw11:00000000 00:06:53.541 [2024-11-29 09:30:16.312745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.541 #9 NEW cov: 11733 ft: 13441 corp: 8/29b lim: 10 exec/s: 0 rss: 68Mb L: 4/8 MS: 1 CrossOver- 00:06:53.541 [2024-11-29 09:30:16.352665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 00:06:53.541 [2024-11-29 09:30:16.352689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.541 #10 NEW cov: 11733 ft: 13547 corp: 9/32b lim: 10 exec/s: 0 rss: 68Mb L: 3/8 MS: 1 ChangeBinInt- 00:06:53.801 [2024-11-29 09:30:16.393007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000102 cdw11:00000000 00:06:53.801 [2024-11-29 09:30:16.393033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.801 [2024-11-29 09:30:16.393085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:53.801 [2024-11-29 09:30:16.393098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.801 [2024-11-29 09:30:16.393151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007b0a cdw11:00000000 00:06:53.801 [2024-11-29 09:30:16.393180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:53.801 #11 NEW cov: 11733 ft: 13588 corp: 10/38b lim: 10 exec/s: 0 rss: 68Mb L: 6/8 MS: 1 CMP- DE: 
"\001\002\000\000"- 00:06:53.801 [2024-11-29 09:30:16.432993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 00:06:53.801 [2024-11-29 09:30:16.433017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.801 [2024-11-29 09:30:16.433066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000250a cdw11:00000000 00:06:53.801 [2024-11-29 09:30:16.433079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.801 #12 NEW cov: 11733 ft: 13704 corp: 11/42b lim: 10 exec/s: 0 rss: 68Mb L: 4/8 MS: 1 InsertByte- 00:06:53.801 [2024-11-29 09:30:16.473038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000bb0a cdw11:00000000 00:06:53.801 [2024-11-29 09:30:16.473063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.801 #13 NEW cov: 11733 ft: 13750 corp: 12/44b lim: 10 exec/s: 0 rss: 68Mb L: 2/8 MS: 1 EraseBytes- 00:06:53.801 [2024-11-29 09:30:16.513344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000102 cdw11:00000000 00:06:53.801 [2024-11-29 09:30:16.513369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.801 [2024-11-29 09:30:16.513420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:53.801 [2024-11-29 09:30:16.513433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.801 [2024-11-29 09:30:16.513483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000bb7b cdw11:00000000 00:06:53.801 [2024-11-29 09:30:16.513496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:53.801 #14 NEW cov: 11733 ft: 13802 corp: 13/51b lim: 10 exec/s: 0 rss: 68Mb L: 7/8 MS: 1 PersAutoDict- DE: "\001\002\000\000"- 00:06:53.801 [2024-11-29 09:30:16.553609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 00:06:53.801 [2024-11-29 09:30:16.553634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.801 [2024-11-29 09:30:16.553688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000102 cdw11:00000000 00:06:53.801 [2024-11-29 09:30:16.553702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.801 [2024-11-29 09:30:16.553753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:06:53.801 [2024-11-29 09:30:16.553766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:53.801 [2024-11-29 09:30:16.553817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000250a cdw11:00000000 00:06:53.801 
[2024-11-29 09:30:16.553829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:53.801 #15 NEW cov: 11733 ft: 13829 corp: 14/59b lim: 10 exec/s: 0 rss: 69Mb L: 8/8 MS: 1 PersAutoDict- DE: "\001\002\000\000"- 00:06:53.801 [2024-11-29 09:30:16.593490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003bbb cdw11:00000000 00:06:53.801 [2024-11-29 09:30:16.593518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.801 [2024-11-29 09:30:16.593570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007b0a cdw11:00000000 00:06:53.801 [2024-11-29 09:30:16.593583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:53.801 #16 NEW cov: 11733 ft: 13836 corp: 15/63b lim: 10 exec/s: 0 rss: 69Mb L: 4/8 MS: 1 InsertByte- 00:06:53.801 [2024-11-29 09:30:16.623578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000026 cdw11:00000000 00:06:53.801 [2024-11-29 09:30:16.623607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:53.801 [2024-11-29 09:30:16.623658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000250a cdw11:00000000 00:06:53.801 [2024-11-29 09:30:16.623672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.061 #17 NEW cov: 11733 ft: 13851 corp: 16/67b lim: 10 exec/s: 0 rss: 69Mb L: 4/8 MS: 1 ChangeByte- 00:06:54.061 [2024-11-29 09:30:16.663831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000102 cdw11:00000000 00:06:54.061 [2024-11-29 09:30:16.663856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.061 [2024-11-29 09:30:16.663908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.061 [2024-11-29 09:30:16.663922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.061 [2024-11-29 09:30:16.663971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00007b0a cdw11:00000000 00:06:54.061 [2024-11-29 09:30:16.664000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.061 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:54.061 #18 NEW cov: 11756 ft: 13893 corp: 17/73b lim: 10 exec/s: 0 rss: 69Mb L: 6/8 MS: 1 CopyPart- 00:06:54.061 [2024-11-29 09:30:16.703849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 00:06:54.061 [2024-11-29 09:30:16.703874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.061 [2024-11-29 09:30:16.703924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000250a 
cdw11:00000000 00:06:54.061 [2024-11-29 09:30:16.703937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.061 #19 NEW cov: 11756 ft: 13913 corp: 18/77b lim: 10 exec/s: 0 rss: 69Mb L: 4/8 MS: 1 CrossOver- 00:06:54.061 [2024-11-29 09:30:16.743950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002a03 cdw11:00000000 00:06:54.061 [2024-11-29 09:30:16.743974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.061 [2024-11-29 09:30:16.744026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000250a cdw11:00000000 00:06:54.061 [2024-11-29 09:30:16.744040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.061 #20 NEW cov: 11756 ft: 13922 corp: 19/81b lim: 10 exec/s: 0 rss: 69Mb L: 4/8 MS: 1 ChangeByte- 00:06:54.061 [2024-11-29 09:30:16.773890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a31 cdw11:00000000 00:06:54.061 [2024-11-29 09:30:16.773914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.061 #21 NEW cov: 11756 ft: 13997 corp: 20/83b lim: 10 exec/s: 21 rss: 69Mb L: 2/8 MS: 1 ShuffleBytes- 00:06:54.061 [2024-11-29 09:30:16.814031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000300 cdw11:00000000 00:06:54.061 [2024-11-29 09:30:16.814056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.061 #22 NEW cov: 11756 ft: 14011 corp: 21/86b lim: 10 exec/s: 22 rss: 69Mb L: 3/8 MS: 1 ShuffleBytes- 00:06:54.062 [2024-11-29 09:30:16.854127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fb0a cdw11:00000000 00:06:54.062 [2024-11-29 09:30:16.854152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.062 #23 NEW cov: 11756 ft: 14030 corp: 22/88b lim: 10 exec/s: 23 rss: 69Mb L: 2/8 MS: 1 ChangeBit- 00:06:54.062 [2024-11-29 09:30:16.894493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008484 cdw11:00000000 00:06:54.062 [2024-11-29 09:30:16.894518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.062 [2024-11-29 09:30:16.894570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008484 cdw11:00000000 00:06:54.062 [2024-11-29 09:30:16.894584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.062 [2024-11-29 09:30:16.894639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008484 cdw11:00000000 00:06:54.062 [2024-11-29 09:30:16.894653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.321 #24 NEW cov: 11756 ft: 14044 corp: 23/95b lim: 10 exec/s: 24 rss: 69Mb L: 7/8 MS: 1 ShuffleBytes- 00:06:54.321 [2024-11-29 09:30:16.934758] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000102 cdw11:00000000 00:06:54.321 [2024-11-29 09:30:16.934783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.321 [2024-11-29 09:30:16.934834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000000a3 cdw11:00000000 00:06:54.321 [2024-11-29 09:30:16.934847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.321 [2024-11-29 09:30:16.934897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a3a3 cdw11:00000000 00:06:54.321 [2024-11-29 09:30:16.934925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.321 [2024-11-29 09:30:16.934976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000007b cdw11:00000000 00:06:54.321 [2024-11-29 09:30:16.934988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:54.321 #25 NEW cov: 11756 ft: 14068 corp: 24/104b lim: 10 exec/s: 25 rss: 69Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:06:54.321 [2024-11-29 09:30:16.974557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 00:06:54.321 [2024-11-29 09:30:16.974582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.321 [2024-11-29 09:30:16.974653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000250a cdw11:00000000 00:06:54.321 [2024-11-29 09:30:16.974667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.321 #26 NEW cov: 11756 ft: 14156 corp: 25/108b lim: 10 exec/s: 26 rss: 69Mb L: 4/9 MS: 1 ShuffleBytes- 00:06:54.321 [2024-11-29 09:30:17.014842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000102 cdw11:00000000 00:06:54.321 [2024-11-29 09:30:17.014866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.321 [2024-11-29 09:30:17.014916] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.321 [2024-11-29 09:30:17.014929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.321 [2024-11-29 09:30:17.014979] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000300 cdw11:00000000 00:06:54.321 [2024-11-29 09:30:17.014992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.321 #27 NEW cov: 11756 ft: 14176 corp: 26/115b lim: 10 exec/s: 27 rss: 69Mb L: 7/9 MS: 1 PersAutoDict- DE: "\001\002\000\000"- 00:06:54.321 [2024-11-29 09:30:17.054945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008484 cdw11:00000000 00:06:54.321 [2024-11-29 09:30:17.054969] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.321 [2024-11-29 09:30:17.055021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005884 cdw11:00000000 00:06:54.321 [2024-11-29 09:30:17.055034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.321 [2024-11-29 09:30:17.055085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008484 cdw11:00000000 00:06:54.321 [2024-11-29 09:30:17.055098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.321 #28 NEW cov: 11756 ft: 14194 corp: 27/122b lim: 10 exec/s: 28 rss: 69Mb L: 7/9 MS: 1 ChangeByte- 00:06:54.321 [2024-11-29 09:30:17.095160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.321 [2024-11-29 09:30:17.095184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.321 [2024-11-29 09:30:17.095236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.321 [2024-11-29 09:30:17.095249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.321 [2024-11-29 09:30:17.095301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000026 cdw11:00000000 00:06:54.321 [2024-11-29 09:30:17.095313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.321 [2024-11-29 09:30:17.095365] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000250a cdw11:00000000 00:06:54.321 [2024-11-29 09:30:17.095377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:54.321 #29 NEW cov: 11756 ft: 14226 corp: 28/130b lim: 10 exec/s: 29 rss: 69Mb L: 8/9 MS: 1 CMP- DE: "\000\000\000\000"- 00:06:54.321 [2024-11-29 09:30:17.135185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 00:06:54.321 [2024-11-29 09:30:17.135209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.321 [2024-11-29 09:30:17.135259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000025e5 cdw11:00000000 00:06:54.321 [2024-11-29 09:30:17.135272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.321 [2024-11-29 09:30:17.135321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000e5e5 cdw11:00000000 00:06:54.321 [2024-11-29 09:30:17.135339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.321 #30 NEW cov: 11756 ft: 14301 corp: 29/137b lim: 10 exec/s: 30 rss: 69Mb L: 7/9 MS: 1 InsertRepeatedBytes- 00:06:54.580 [2024-11-29 09:30:17.175426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:54.580 [2024-11-29 09:30:17.175450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.580 [2024-11-29 09:30:17.175501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:54.580 [2024-11-29 09:30:17.175514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.580 [2024-11-29 09:30:17.175565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:06:54.580 [2024-11-29 09:30:17.175578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.580 [2024-11-29 09:30:17.175632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000bb0a cdw11:00000000 00:06:54.581 [2024-11-29 09:30:17.175645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:54.581 #31 NEW cov: 11756 ft: 14324 corp: 30/145b lim: 10 exec/s: 31 rss: 69Mb L: 8/9 MS: 1 InsertRepeatedBytes- 00:06:54.581 [2024-11-29 09:30:17.215179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a31 cdw11:00000000 00:06:54.581 [2024-11-29 09:30:17.215203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.581 #32 NEW cov: 11756 ft: 14397 corp: 31/148b lim: 10 exec/s: 32 rss: 69Mb L: 3/9 MS: 1 InsertByte- 00:06:54.581 [2024-11-29 09:30:17.255524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000325 cdw11:00000000 00:06:54.581 [2024-11-29 09:30:17.255549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.581 [2024-11-29 09:30:17.255595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a00 cdw11:00000000 00:06:54.581 [2024-11-29 09:30:17.255612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.581 [2024-11-29 09:30:17.255661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000325 cdw11:00000000 00:06:54.581 [2024-11-29 09:30:17.255674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.581 #33 NEW cov: 11756 ft: 14402 corp: 32/155b lim: 10 exec/s: 33 rss: 70Mb L: 7/9 MS: 1 CopyPart- 00:06:54.581 [2024-11-29 09:30:17.295617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.581 [2024-11-29 09:30:17.295658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.581 [2024-11-29 09:30:17.295709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000026 cdw11:00000000 00:06:54.581 [2024-11-29 09:30:17.295722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:06:54.581 [2024-11-29 09:30:17.295770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000250a cdw11:00000000 00:06:54.581 [2024-11-29 09:30:17.295783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.581 #34 NEW cov: 11756 ft: 14414 corp: 33/161b lim: 10 exec/s: 34 rss: 70Mb L: 6/9 MS: 1 EraseBytes- 00:06:54.581 [2024-11-29 09:30:17.335660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a31 cdw11:00000000 00:06:54.581 [2024-11-29 09:30:17.335686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.581 [2024-11-29 09:30:17.335737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00007a2c cdw11:00000000 00:06:54.581 [2024-11-29 09:30:17.335750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.581 #35 NEW cov: 11756 ft: 14430 corp: 34/165b lim: 10 exec/s: 35 rss: 70Mb L: 4/9 MS: 1 InsertByte- 00:06:54.581 [2024-11-29 09:30:17.375643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000bb7b cdw11:00000000 00:06:54.581 [2024-11-29 09:30:17.375667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.581 #36 NEW cov: 11756 ft: 14442 corp: 35/168b lim: 10 exec/s: 36 rss: 70Mb L: 3/9 MS: 1 ChangeBit- 00:06:54.581 [2024-11-29 09:30:17.405724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000fb0a cdw11:00000000 00:06:54.581 [2024-11-29 09:30:17.405748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.840 #37 NEW cov: 11756 ft: 14452 corp: 36/171b lim: 10 exec/s: 37 rss: 70Mb L: 3/9 MS: 1 InsertByte- 00:06:54.840 [2024-11-29 09:30:17.445864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a35 cdw11:00000000 00:06:54.840 [2024-11-29 09:30:17.445889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.840 #38 NEW cov: 11756 ft: 14504 corp: 37/173b lim: 10 exec/s: 38 rss: 70Mb L: 2/9 MS: 1 ChangeBinInt- 00:06:54.840 [2024-11-29 09:30:17.485946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 00:06:54.840 [2024-11-29 09:30:17.485970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.840 #39 NEW cov: 11756 ft: 14513 corp: 38/176b lim: 10 exec/s: 39 rss: 70Mb L: 3/9 MS: 1 ChangeBinInt- 00:06:54.840 [2024-11-29 09:30:17.526225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000005d cdw11:00000000 00:06:54.840 [2024-11-29 09:30:17.526249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.840 [2024-11-29 09:30:17.526300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000325 cdw11:00000000 00:06:54.840 [2024-11-29 
09:30:17.526314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.840 #40 NEW cov: 11756 ft: 14516 corp: 39/181b lim: 10 exec/s: 40 rss: 70Mb L: 5/9 MS: 1 InsertByte- 00:06:54.840 [2024-11-29 09:30:17.556379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 00:06:54.840 [2024-11-29 09:30:17.556404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.840 [2024-11-29 09:30:17.556455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.840 [2024-11-29 09:30:17.556470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.840 [2024-11-29 09:30:17.556521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000031 cdw11:00000000 00:06:54.840 [2024-11-29 09:30:17.556535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:54.840 #41 NEW cov: 11756 ft: 14562 corp: 40/188b lim: 10 exec/s: 41 rss: 70Mb L: 7/9 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:06:54.841 [2024-11-29 09:30:17.596280] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a31 cdw11:00000000 00:06:54.841 [2024-11-29 09:30:17.596304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.841 #42 NEW cov: 11756 ft: 14567 corp: 41/190b lim: 10 exec/s: 42 rss: 70Mb L: 2/9 MS: 1 CopyPart- 00:06:54.841 [2024-11-29 09:30:17.626455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.841 [2024-11-29 09:30:17.626479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.841 [2024-11-29 09:30:17.626532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:06:54.841 [2024-11-29 09:30:17.626545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:54.841 #43 NEW cov: 11756 ft: 14591 corp: 42/194b lim: 10 exec/s: 43 rss: 70Mb L: 4/9 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:06:54.841 [2024-11-29 09:30:17.656434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00003cf5 cdw11:00000000 00:06:54.841 [2024-11-29 09:30:17.656458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:54.841 #44 NEW cov: 11756 ft: 14594 corp: 43/196b lim: 10 exec/s: 44 rss: 70Mb L: 2/9 MS: 1 ChangeBinInt- 00:06:55.100 [2024-11-29 09:30:17.686946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000102 cdw11:00000000 00:06:55.100 [2024-11-29 09:30:17.686971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.100 [2024-11-29 09:30:17.687025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 
cdw10:000000a3 cdw11:00000000 00:06:55.100 [2024-11-29 09:30:17.687039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.100 [2024-11-29 09:30:17.687090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a3a3 cdw11:00000000 00:06:55.100 [2024-11-29 09:30:17.687104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.100 [2024-11-29 09:30:17.687155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000007b cdw11:00000000 00:06:55.100 [2024-11-29 09:30:17.687167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.100 #45 NEW cov: 11756 ft: 14624 corp: 44/205b lim: 10 exec/s: 45 rss: 70Mb L: 9/9 MS: 1 CrossOver- 00:06:55.100 [2024-11-29 09:30:17.727034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00008484 cdw11:00000000 00:06:55.100 [2024-11-29 09:30:17.727060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.100 [2024-11-29 09:30:17.727112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00008484 cdw11:00000000 00:06:55.100 [2024-11-29 09:30:17.727126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.100 [2024-11-29 09:30:17.727178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00008484 cdw11:00000000 00:06:55.100 [2024-11-29 09:30:17.727191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.100 [2024-11-29 09:30:17.727240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ff0b cdw11:00000000 00:06:55.100 [2024-11-29 09:30:17.727252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:55.100 #46 NEW cov: 11756 ft: 14633 corp: 45/214b lim: 10 exec/s: 46 rss: 70Mb L: 9/9 MS: 1 CMP- DE: "\377\013"- 00:06:55.100 [2024-11-29 09:30:17.766789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a31 cdw11:00000000 00:06:55.100 [2024-11-29 09:30:17.766814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.100 #47 NEW cov: 11756 ft: 14636 corp: 46/217b lim: 10 exec/s: 47 rss: 70Mb L: 3/9 MS: 1 InsertByte- 00:06:55.100 [2024-11-29 09:30:17.796966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000325 cdw11:00000000 00:06:55.100 [2024-11-29 09:30:17.796991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.100 [2024-11-29 09:30:17.797043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:06:55.100 [2024-11-29 09:30:17.797056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.100 #48 NEW cov: 
11756 ft: 14641 corp: 47/221b lim: 10 exec/s: 24 rss: 70Mb L: 4/9 MS: 1 ShuffleBytes- 00:06:55.100 #48 DONE cov: 11756 ft: 14641 corp: 47/221b lim: 10 exec/s: 24 rss: 70Mb 00:06:55.100 ###### Recommended dictionary. ###### 00:06:55.100 "\001\002\000\000" # Uses: 3 00:06:55.100 "\000\000\000\000" # Uses: 2 00:06:55.100 "\377\013" # Uses: 0 00:06:55.100 ###### End of recommended dictionary. ###### 00:06:55.100 Done 48 runs in 2 second(s) 00:06:55.100 09:30:17 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:06:55.100 09:30:17 -- ../common.sh@72 -- # (( i++ )) 00:06:55.100 09:30:17 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:55.100 09:30:17 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:06:55.100 09:30:17 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:06:55.100 09:30:17 -- nvmf/run.sh@24 -- # local timen=1 00:06:55.100 09:30:17 -- nvmf/run.sh@25 -- # local core=0x1 00:06:55.100 09:30:17 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:06:55.100 09:30:17 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:06:55.100 09:30:17 -- nvmf/run.sh@29 -- # printf %02d 8 00:06:55.359 09:30:17 -- nvmf/run.sh@29 -- # port=4408 00:06:55.359 09:30:17 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:06:55.359 09:30:17 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:06:55.359 09:30:17 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:55.359 09:30:17 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:06:55.359 [2024-11-29 09:30:17.980086] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:55.359 [2024-11-29 09:30:17.980156] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3182309 ] 00:06:55.359 EAL: No free 2048 kB hugepages reported on node 1 00:06:55.359 [2024-11-29 09:30:18.165035] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.618 [2024-11-29 09:30:18.229565] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:55.618 [2024-11-29 09:30:18.229715] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.618 [2024-11-29 09:30:18.287669] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:55.618 [2024-11-29 09:30:18.304044] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:06:55.618 INFO: Running with entropic power schedule (0xFF, 100). 
00:06:55.618 INFO: Seed: 3882644364 00:06:55.618 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:55.618 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:55.618 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:06:55.618 INFO: A corpus is not provided, starting from an empty corpus 00:06:55.618 [2024-11-29 09:30:18.359247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.618 [2024-11-29 09:30:18.359277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.618 #2 INITED cov: 11557 ft: 11558 corp: 1/1b exec/s: 0 rss: 66Mb 00:06:55.619 [2024-11-29 09:30:18.389196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.619 [2024-11-29 09:30:18.389221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.619 #3 NEW cov: 11670 ft: 11974 corp: 2/2b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 CrossOver- 00:06:55.619 [2024-11-29 09:30:18.429481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.619 [2024-11-29 09:30:18.429506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.619 [2024-11-29 09:30:18.429562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.619 [2024-11-29 09:30:18.429576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.619 #4 NEW cov: 11676 ft: 12752 corp: 3/4b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 InsertByte- 00:06:55.878 [2024-11-29 09:30:18.469394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.878 [2024-11-29 09:30:18.469420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.878 #5 NEW cov: 11761 ft: 13072 corp: 4/5b lim: 5 exec/s: 0 rss: 67Mb L: 1/2 MS: 1 ChangeByte- 00:06:55.878 [2024-11-29 09:30:18.509720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.878 [2024-11-29 09:30:18.509745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.878 [2024-11-29 09:30:18.509801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.878 [2024-11-29 09:30:18.509814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.878 #6 NEW cov: 11761 ft: 13320 corp: 5/7b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 ChangeBinInt- 00:06:55.878 
[2024-11-29 09:30:18.549953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.878 [2024-11-29 09:30:18.549978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.878 [2024-11-29 09:30:18.550033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.878 [2024-11-29 09:30:18.550047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.878 [2024-11-29 09:30:18.550101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.878 [2024-11-29 09:30:18.550118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.878 #7 NEW cov: 11761 ft: 13531 corp: 6/10b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 InsertByte- 00:06:55.878 [2024-11-29 09:30:18.590060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.878 [2024-11-29 09:30:18.590085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.878 [2024-11-29 09:30:18.590142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.878 [2024-11-29 09:30:18.590155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.878 [2024-11-29 09:30:18.590211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.878 [2024-11-29 09:30:18.590224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.878 #8 NEW cov: 11761 ft: 13587 corp: 7/13b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 ChangeByte- 00:06:55.878 [2024-11-29 09:30:18.630214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.878 [2024-11-29 09:30:18.630240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.878 [2024-11-29 09:30:18.630298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.878 [2024-11-29 09:30:18.630311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.878 [2024-11-29 09:30:18.630366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.878 [2024-11-29 09:30:18.630379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:55.878 #9 NEW cov: 11761 ft: 13633 corp: 8/16b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 ChangeByte- 00:06:55.878 [2024-11-29 09:30:18.670220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.878 [2024-11-29 09:30:18.670244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.878 [2024-11-29 09:30:18.670264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.878 [2024-11-29 09:30:18.670274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:55.878 #10 NEW cov: 11761 ft: 13645 corp: 9/18b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 InsertByte- 00:06:55.878 [2024-11-29 09:30:18.710291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.879 [2024-11-29 09:30:18.710316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:55.879 [2024-11-29 09:30:18.710370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:55.879 [2024-11-29 09:30:18.710384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.137 #11 NEW cov: 11761 ft: 13725 corp: 10/20b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 CopyPart- 00:06:56.137 [2024-11-29 09:30:18.750187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.137 [2024-11-29 09:30:18.750212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.137 #12 NEW cov: 11761 ft: 13769 corp: 11/21b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 EraseBytes- 00:06:56.137 [2024-11-29 09:30:18.790469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.137 [2024-11-29 09:30:18.790494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.137 [2024-11-29 09:30:18.790551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.137 [2024-11-29 09:30:18.790565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.137 #13 NEW cov: 11761 ft: 13798 corp: 12/23b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 ChangeBinInt- 00:06:56.137 [2024-11-29 09:30:18.830767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.137 [2024-11-29 09:30:18.830792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.137 [2024-11-29 09:30:18.830847] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.137 [2024-11-29 09:30:18.830861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.137 [2024-11-29 09:30:18.830917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.137 [2024-11-29 09:30:18.830930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.137 #14 NEW cov: 11761 ft: 13925 corp: 13/26b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 ChangeByte- 00:06:56.137 [2024-11-29 09:30:18.870560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.138 [2024-11-29 09:30:18.870585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.138 #15 NEW cov: 11761 ft: 14002 corp: 14/27b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 ChangeByte- 00:06:56.138 [2024-11-29 09:30:18.910982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.138 [2024-11-29 09:30:18.911007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.138 [2024-11-29 09:30:18.911061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.138 [2024-11-29 09:30:18.911074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.138 [2024-11-29 09:30:18.911131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.138 [2024-11-29 09:30:18.911144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.138 #16 NEW cov: 11761 ft: 14034 corp: 15/30b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 ShuffleBytes- 00:06:56.138 [2024-11-29 09:30:18.950936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.138 [2024-11-29 09:30:18.950960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.138 [2024-11-29 09:30:18.951018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.138 [2024-11-29 09:30:18.951032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.138 #17 NEW cov: 11761 ft: 14040 corp: 16/32b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 EraseBytes- 00:06:56.396 [2024-11-29 09:30:18.990863] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.396 [2024-11-29 09:30:18.990887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.396 #18 NEW cov: 11761 ft: 14076 corp: 17/33b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 ChangeByte- 00:06:56.396 [2024-11-29 09:30:19.020958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.396 [2024-11-29 09:30:19.020983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.396 #19 NEW cov: 11761 ft: 14101 corp: 18/34b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 ChangeBit- 00:06:56.396 [2024-11-29 09:30:19.061068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.396 [2024-11-29 09:30:19.061093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.396 #20 NEW cov: 11761 ft: 14177 corp: 19/35b lim: 5 exec/s: 0 rss: 68Mb L: 1/3 MS: 1 CrossOver- 00:06:56.396 [2024-11-29 09:30:19.101390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.396 [2024-11-29 09:30:19.101415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.396 [2024-11-29 09:30:19.101475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.396 [2024-11-29 09:30:19.101488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.396 #21 NEW cov: 11761 ft: 14213 corp: 20/37b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 ChangeBit- 00:06:56.396 [2024-11-29 09:30:19.141314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.396 [2024-11-29 09:30:19.141338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.396 #22 NEW cov: 11761 ft: 14231 corp: 21/38b lim: 5 exec/s: 0 rss: 68Mb L: 1/3 MS: 1 ChangeBit- 00:06:56.397 [2024-11-29 09:30:19.181944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.397 [2024-11-29 09:30:19.181970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.397 [2024-11-29 09:30:19.182043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.397 [2024-11-29 09:30:19.182056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.397 [2024-11-29 09:30:19.182114] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.397 [2024-11-29 09:30:19.182128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.397 [2024-11-29 09:30:19.182182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.397 [2024-11-29 09:30:19.182196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.397 #23 NEW cov: 11761 ft: 14554 corp: 22/42b lim: 5 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:06:56.397 [2024-11-29 09:30:19.221751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.397 [2024-11-29 09:30:19.221776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.397 [2024-11-29 09:30:19.221849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.397 [2024-11-29 09:30:19.221863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.913 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:56.913 #24 NEW cov: 11784 ft: 14619 corp: 23/44b lim: 5 exec/s: 24 rss: 69Mb L: 2/4 MS: 1 ShuffleBytes- 00:06:56.913 [2024-11-29 09:30:19.522814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.913 [2024-11-29 09:30:19.522846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.913 [2024-11-29 09:30:19.522904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.913 [2024-11-29 09:30:19.522918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.913 [2024-11-29 09:30:19.522972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.913 [2024-11-29 09:30:19.522985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.913 [2024-11-29 09:30:19.523038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.913 [2024-11-29 09:30:19.523051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:56.913 #25 NEW cov: 11784 ft: 14638 corp: 24/48b lim: 5 exec/s: 25 rss: 69Mb L: 4/4 MS: 1 CrossOver- 00:06:56.913 [2024-11-29 09:30:19.562549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.913 [2024-11-29 09:30:19.562575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.913 [2024-11-29 09:30:19.562648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.913 [2024-11-29 09:30:19.562663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.913 #26 NEW cov: 11784 ft: 14641 corp: 25/50b lim: 5 exec/s: 26 rss: 69Mb L: 2/4 MS: 1 ShuffleBytes- 00:06:56.913 [2024-11-29 09:30:19.602893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.913 [2024-11-29 09:30:19.602921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.913 [2024-11-29 09:30:19.602995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.913 [2024-11-29 09:30:19.603009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.913 [2024-11-29 09:30:19.603065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.913 [2024-11-29 09:30:19.603078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:56.913 #27 NEW cov: 11784 ft: 14648 corp: 26/53b lim: 5 exec/s: 27 rss: 69Mb L: 3/4 MS: 1 ShuffleBytes- 00:06:56.913 [2024-11-29 09:30:19.642652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.913 [2024-11-29 09:30:19.642677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.913 #28 NEW cov: 11784 ft: 14693 corp: 27/54b lim: 5 exec/s: 28 rss: 69Mb L: 1/4 MS: 1 CopyPart- 00:06:56.913 [2024-11-29 09:30:19.672929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.913 [2024-11-29 09:30:19.672954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.913 [2024-11-29 09:30:19.673025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.913 [2024-11-29 09:30:19.673039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:56.913 #29 NEW cov: 11784 ft: 14717 corp: 28/56b lim: 5 exec/s: 29 rss: 69Mb L: 2/4 MS: 1 ChangeByte- 00:06:56.913 [2024-11-29 09:30:19.712869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:06:56.914 [2024-11-29 09:30:19.712894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:56.914 #30 NEW cov: 11784 ft: 14745 corp: 29/57b lim: 5 exec/s: 30 rss: 69Mb L: 1/4 MS: 1 EraseBytes- 00:06:56.914 [2024-11-29 09:30:19.742921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:56.914 [2024-11-29 09:30:19.742945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.172 #31 NEW cov: 11784 ft: 14767 corp: 30/58b lim: 5 exec/s: 31 rss: 69Mb L: 1/4 MS: 1 ChangeByte- 00:06:57.172 [2024-11-29 09:30:19.783197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:19.783223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.172 [2024-11-29 09:30:19.783281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:19.783295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.172 #32 NEW cov: 11784 ft: 14782 corp: 31/60b lim: 5 exec/s: 32 rss: 69Mb L: 2/4 MS: 1 ChangeBit- 00:06:57.172 [2024-11-29 09:30:19.823193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:19.823218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.172 #33 NEW cov: 11784 ft: 14809 corp: 32/61b lim: 5 exec/s: 33 rss: 69Mb L: 1/4 MS: 1 ChangeBit- 00:06:57.172 [2024-11-29 09:30:19.853262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:19.853287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.172 #34 NEW cov: 11784 ft: 14836 corp: 33/62b lim: 5 exec/s: 34 rss: 69Mb L: 1/4 MS: 1 ShuffleBytes- 00:06:57.172 [2024-11-29 09:30:19.893984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:19.894010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.172 [2024-11-29 09:30:19.894084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:19.894098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.172 [2024-11-29 09:30:19.894153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:19.894166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.172 [2024-11-29 09:30:19.894219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:19.894233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:57.172 [2024-11-29 09:30:19.894286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:19.894299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:57.172 #35 NEW cov: 11784 ft: 14891 corp: 34/67b lim: 5 exec/s: 35 rss: 70Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:06:57.172 [2024-11-29 09:30:19.933798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:19.933823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.172 [2024-11-29 09:30:19.933895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:19.933908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.172 [2024-11-29 09:30:19.933962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:19.933975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.172 #36 NEW cov: 11784 ft: 14897 corp: 35/70b lim: 5 exec/s: 36 rss: 70Mb L: 3/5 MS: 1 CrossOver- 00:06:57.172 [2024-11-29 09:30:19.973917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:19.973944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.172 [2024-11-29 09:30:19.974015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:19.974028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.172 [2024-11-29 09:30:19.974083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:19.974095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.172 #37 NEW cov: 11784 ft: 14904 corp: 36/73b lim: 5 exec/s: 37 rss: 70Mb L: 3/5 MS: 1 ChangeBit- 00:06:57.172 
[2024-11-29 09:30:20.013885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:20.013912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.172 [2024-11-29 09:30:20.013971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.172 [2024-11-29 09:30:20.013987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.432 #38 NEW cov: 11784 ft: 14957 corp: 37/75b lim: 5 exec/s: 38 rss: 70Mb L: 2/5 MS: 1 InsertByte- 00:06:57.432 [2024-11-29 09:30:20.054480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.054508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.432 [2024-11-29 09:30:20.054563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.054577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.432 [2024-11-29 09:30:20.054635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.054649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.432 [2024-11-29 09:30:20.054701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.054715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:57.432 [2024-11-29 09:30:20.054765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.054779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:06:57.432 #39 NEW cov: 11784 ft: 14985 corp: 38/80b lim: 5 exec/s: 39 rss: 70Mb L: 5/5 MS: 1 ChangeBinInt- 00:06:57.432 [2024-11-29 09:30:20.104317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.104346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.432 [2024-11-29 09:30:20.104399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.104416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.432 [2024-11-29 09:30:20.104469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.104483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.432 #40 NEW cov: 11784 ft: 14996 corp: 39/83b lim: 5 exec/s: 40 rss: 70Mb L: 3/5 MS: 1 CopyPart- 00:06:57.432 [2024-11-29 09:30:20.144088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.144114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.432 #41 NEW cov: 11784 ft: 15001 corp: 40/84b lim: 5 exec/s: 41 rss: 70Mb L: 1/5 MS: 1 ChangeBit- 00:06:57.432 [2024-11-29 09:30:20.184348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.184373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.432 [2024-11-29 09:30:20.184429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.184443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.432 #42 NEW cov: 11784 ft: 15021 corp: 41/86b lim: 5 exec/s: 42 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:06:57.432 [2024-11-29 09:30:20.224801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.224826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.432 [2024-11-29 09:30:20.224881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.224894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.432 [2024-11-29 09:30:20.224945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.224959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.432 [2024-11-29 09:30:20.225011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.225025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:57.432 #43 NEW cov: 11784 ft: 15029 corp: 42/90b lim: 5 exec/s: 43 rss: 70Mb L: 4/5 MS: 1 CrossOver- 00:06:57.432 [2024-11-29 09:30:20.264633] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.264659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.432 [2024-11-29 09:30:20.264714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.432 [2024-11-29 09:30:20.264727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.692 #44 NEW cov: 11784 ft: 15034 corp: 43/92b lim: 5 exec/s: 44 rss: 70Mb L: 2/5 MS: 1 EraseBytes- 00:06:57.692 [2024-11-29 09:30:20.304898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.692 [2024-11-29 09:30:20.304923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.692 [2024-11-29 09:30:20.304978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.692 [2024-11-29 09:30:20.304992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.692 [2024-11-29 09:30:20.305046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.692 [2024-11-29 09:30:20.305060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:57.692 #45 NEW cov: 11784 ft: 15042 corp: 44/95b lim: 5 exec/s: 45 rss: 70Mb L: 3/5 MS: 1 ChangeByte- 00:06:57.692 [2024-11-29 09:30:20.344857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.692 [2024-11-29 09:30:20.344883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:57.692 [2024-11-29 09:30:20.344938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:57.692 [2024-11-29 09:30:20.344952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:57.692 #46 NEW cov: 11784 ft: 15062 corp: 45/97b lim: 5 exec/s: 23 rss: 70Mb L: 2/5 MS: 1 CopyPart- 00:06:57.692 #46 DONE cov: 11784 ft: 15062 corp: 45/97b lim: 5 exec/s: 23 rss: 70Mb 00:06:57.692 Done 46 runs in 2 second(s) 00:06:57.692 09:30:20 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:06:57.692 09:30:20 -- ../common.sh@72 -- # (( i++ )) 00:06:57.692 09:30:20 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:06:57.692 09:30:20 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:06:57.692 09:30:20 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:06:57.692 09:30:20 -- nvmf/run.sh@24 -- # local timen=1 00:06:57.692 09:30:20 -- nvmf/run.sh@25 -- # local core=0x1 00:06:57.692 09:30:20 -- nvmf/run.sh@26 -- # local 
corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:06:57.692 09:30:20 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:06:57.692 09:30:20 -- nvmf/run.sh@29 -- # printf %02d 9 00:06:57.692 09:30:20 -- nvmf/run.sh@29 -- # port=4409 00:06:57.692 09:30:20 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:06:57.692 09:30:20 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:06:57.692 09:30:20 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:06:57.692 09:30:20 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:06:57.953 [2024-11-29 09:30:20.538428] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:57.953 [2024-11-29 09:30:20.538515] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3182750 ] 00:06:57.953 EAL: No free 2048 kB hugepages reported on node 1 00:06:57.953 [2024-11-29 09:30:20.788110] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.213 [2024-11-29 09:30:20.878230] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:58.213 [2024-11-29 09:30:20.878372] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.213 [2024-11-29 09:30:20.936442] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:58.213 [2024-11-29 09:30:20.952817] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:06:58.213 INFO: Running with entropic power schedule (0xFF, 100). 
00:06:58.213 INFO: Seed: 2234666788 00:06:58.213 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:06:58.213 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:06:58.213 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:06:58.213 INFO: A corpus is not provided, starting from an empty corpus 00:06:58.213 [2024-11-29 09:30:21.019636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.213 [2024-11-29 09:30:21.019674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.213 #2 INITED cov: 11543 ft: 11555 corp: 1/1b exec/s: 0 rss: 67Mb 00:06:58.471 [2024-11-29 09:30:21.070018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.471 [2024-11-29 09:30:21.070048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.471 [2024-11-29 09:30:21.070126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.471 [2024-11-29 09:30:21.070142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.730 NEW_FUNC[1/1]: 0x194d438 in _reactor_run /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:894 00:06:58.730 #3 NEW cov: 11670 ft: 12769 corp: 2/3b lim: 5 exec/s: 0 rss: 69Mb L: 2/2 MS: 1 InsertByte- 00:06:58.730 [2024-11-29 09:30:21.400606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.730 [2024-11-29 09:30:21.400658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.730 [2024-11-29 09:30:21.400808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.730 [2024-11-29 09:30:21.400832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.730 [2024-11-29 09:30:21.400961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.730 [2024-11-29 09:30:21.400983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.730 #4 NEW cov: 11676 ft: 13306 corp: 3/6b lim: 5 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 CrossOver- 00:06:58.730 [2024-11-29 09:30:21.450305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.730 [2024-11-29 09:30:21.450335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.730 [2024-11-29 09:30:21.450467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.730 [2024-11-29 09:30:21.450485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.730 #5 NEW cov: 11761 ft: 13575 corp: 4/8b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 ChangeBinInt- 00:06:58.730 [2024-11-29 09:30:21.490104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.730 [2024-11-29 09:30:21.490133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.730 #6 NEW cov: 11761 ft: 13736 corp: 5/9b lim: 5 exec/s: 0 rss: 69Mb L: 1/3 MS: 1 ChangeByte- 00:06:58.730 [2024-11-29 09:30:21.530868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.730 [2024-11-29 09:30:21.530895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.730 [2024-11-29 09:30:21.531019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.730 [2024-11-29 09:30:21.531036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.730 [2024-11-29 09:30:21.531154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.730 [2024-11-29 09:30:21.531171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.730 #7 NEW cov: 11761 ft: 13786 corp: 6/12b lim: 5 exec/s: 0 rss: 69Mb L: 3/3 MS: 1 InsertByte- 00:06:58.730 [2024-11-29 09:30:21.570430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.730 [2024-11-29 09:30:21.570457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.991 #8 NEW cov: 11761 ft: 13877 corp: 7/13b lim: 5 exec/s: 0 rss: 69Mb L: 1/3 MS: 1 EraseBytes- 00:06:58.991 [2024-11-29 09:30:21.610742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.991 [2024-11-29 09:30:21.610769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.991 [2024-11-29 09:30:21.610888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.991 [2024-11-29 09:30:21.610904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.991 #9 NEW cov: 11761 ft: 13906 corp: 8/15b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 CrossOver- 00:06:58.991 [2024-11-29 09:30:21.650886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.991 [2024-11-29 09:30:21.650911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.991 [2024-11-29 09:30:21.651030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.991 [2024-11-29 09:30:21.651059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.991 #10 NEW cov: 11761 ft: 13946 corp: 9/17b lim: 5 exec/s: 0 rss: 69Mb L: 2/3 MS: 1 CopyPart- 00:06:58.991 [2024-11-29 09:30:21.691554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.991 [2024-11-29 09:30:21.691580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.991 [2024-11-29 09:30:21.691710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.991 [2024-11-29 09:30:21.691726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.991 [2024-11-29 09:30:21.691842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.991 [2024-11-29 09:30:21.691858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.991 [2024-11-29 09:30:21.691976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.991 [2024-11-29 09:30:21.691991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:58.991 #11 NEW cov: 11761 ft: 14250 corp: 10/21b lim: 5 exec/s: 0 rss: 69Mb L: 4/4 MS: 1 CrossOver- 00:06:58.991 [2024-11-29 09:30:21.741714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.991 [2024-11-29 09:30:21.741739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.991 [2024-11-29 09:30:21.741862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.991 [2024-11-29 09:30:21.741878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.991 [2024-11-29 09:30:21.741998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.991 [2024-11-29 09:30:21.742014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.991 [2024-11-29 
09:30:21.742133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.991 [2024-11-29 09:30:21.742149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:58.991 #12 NEW cov: 11761 ft: 14321 corp: 11/25b lim: 5 exec/s: 0 rss: 69Mb L: 4/4 MS: 1 CopyPart- 00:06:58.991 [2024-11-29 09:30:21.781524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.991 [2024-11-29 09:30:21.781552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:58.991 [2024-11-29 09:30:21.781663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.991 [2024-11-29 09:30:21.781680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:58.991 [2024-11-29 09:30:21.781796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.991 [2024-11-29 09:30:21.781812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:58.991 #13 NEW cov: 11761 ft: 14332 corp: 12/28b lim: 5 exec/s: 0 rss: 69Mb L: 3/4 MS: 1 EraseBytes- 00:06:58.991 [2024-11-29 09:30:21.821196] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:58.991 [2024-11-29 09:30:21.821225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.251 #14 NEW cov: 11761 ft: 14365 corp: 13/29b lim: 5 exec/s: 0 rss: 69Mb L: 1/4 MS: 1 ChangeASCIIInt- 00:06:59.251 [2024-11-29 09:30:21.861508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.251 [2024-11-29 09:30:21.861535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.251 [2024-11-29 09:30:21.861644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.251 [2024-11-29 09:30:21.861660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.251 #15 NEW cov: 11761 ft: 14385 corp: 14/31b lim: 5 exec/s: 0 rss: 69Mb L: 2/4 MS: 1 InsertByte- 00:06:59.251 [2024-11-29 09:30:21.901672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.251 [2024-11-29 09:30:21.901700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.251 [2024-11-29 09:30:21.901819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT 
(0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.251 [2024-11-29 09:30:21.901848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.251 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:06:59.251 #16 NEW cov: 11784 ft: 14503 corp: 15/33b lim: 5 exec/s: 0 rss: 69Mb L: 2/4 MS: 1 EraseBytes- 00:06:59.251 [2024-11-29 09:30:21.942362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.251 [2024-11-29 09:30:21.942388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.251 [2024-11-29 09:30:21.942506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.252 [2024-11-29 09:30:21.942523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.252 [2024-11-29 09:30:21.942652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.252 [2024-11-29 09:30:21.942668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.252 [2024-11-29 09:30:21.942778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.252 [2024-11-29 09:30:21.942794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:59.252 #17 NEW cov: 11784 ft: 14529 corp: 16/37b lim: 5 exec/s: 0 rss: 69Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:06:59.252 [2024-11-29 09:30:21.982164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.252 [2024-11-29 09:30:21.982191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.252 [2024-11-29 09:30:21.982312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.252 [2024-11-29 09:30:21.982327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.252 [2024-11-29 09:30:21.982450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.252 [2024-11-29 09:30:21.982465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.252 #18 NEW cov: 11784 ft: 14552 corp: 17/40b lim: 5 exec/s: 18 rss: 70Mb L: 3/4 MS: 1 CopyPart- 00:06:59.252 [2024-11-29 09:30:22.022344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:06:59.252 [2024-11-29 09:30:22.022371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.252 [2024-11-29 09:30:22.022494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.252 [2024-11-29 09:30:22.022510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.252 [2024-11-29 09:30:22.022626] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.252 [2024-11-29 09:30:22.022655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.252 #19 NEW cov: 11784 ft: 14625 corp: 18/43b lim: 5 exec/s: 19 rss: 70Mb L: 3/4 MS: 1 ChangeBinInt- 00:06:59.252 [2024-11-29 09:30:22.062114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.252 [2024-11-29 09:30:22.062140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.252 [2024-11-29 09:30:22.062266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.252 [2024-11-29 09:30:22.062282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.252 #20 NEW cov: 11784 ft: 14640 corp: 19/45b lim: 5 exec/s: 20 rss: 70Mb L: 2/4 MS: 1 ShuffleBytes- 00:06:59.512 [2024-11-29 09:30:22.102513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.512 [2024-11-29 09:30:22.102539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.512 [2024-11-29 09:30:22.102658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.512 [2024-11-29 09:30:22.102675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.512 [2024-11-29 09:30:22.102796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.512 [2024-11-29 09:30:22.102811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.512 #21 NEW cov: 11784 ft: 14724 corp: 20/48b lim: 5 exec/s: 21 rss: 70Mb L: 3/4 MS: 1 ShuffleBytes- 00:06:59.512 [2024-11-29 09:30:22.142391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.512 [2024-11-29 09:30:22.142418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.512 
[2024-11-29 09:30:22.142529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.512 [2024-11-29 09:30:22.142550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.512 #22 NEW cov: 11784 ft: 14753 corp: 21/50b lim: 5 exec/s: 22 rss: 70Mb L: 2/4 MS: 1 ChangeBit- 00:06:59.512 [2024-11-29 09:30:22.182483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.512 [2024-11-29 09:30:22.182509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.512 [2024-11-29 09:30:22.182619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.512 [2024-11-29 09:30:22.182645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.512 #23 NEW cov: 11784 ft: 14820 corp: 22/52b lim: 5 exec/s: 23 rss: 70Mb L: 2/4 MS: 1 ShuffleBytes- 00:06:59.512 [2024-11-29 09:30:22.222379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.512 [2024-11-29 09:30:22.222405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.512 #24 NEW cov: 11784 ft: 14849 corp: 23/53b lim: 5 exec/s: 24 rss: 70Mb L: 1/4 MS: 1 ChangeByte- 00:06:59.512 [2024-11-29 09:30:22.263033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.512 [2024-11-29 09:30:22.263060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.513 [2024-11-29 09:30:22.263168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.513 [2024-11-29 09:30:22.263185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.513 [2024-11-29 09:30:22.263300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.513 [2024-11-29 09:30:22.263316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.513 #25 NEW cov: 11784 ft: 14869 corp: 24/56b lim: 5 exec/s: 25 rss: 70Mb L: 3/4 MS: 1 CopyPart- 00:06:59.513 [2024-11-29 09:30:22.302891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.513 [2024-11-29 09:30:22.302917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.513 [2024-11-29 09:30:22.303036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.513 [2024-11-29 09:30:22.303052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.513 #26 NEW cov: 11784 ft: 14875 corp: 25/58b lim: 5 exec/s: 26 rss: 70Mb L: 2/4 MS: 1 ChangeByte- 00:06:59.513 [2024-11-29 09:30:22.343005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.513 [2024-11-29 09:30:22.343032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.513 [2024-11-29 09:30:22.343149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.513 [2024-11-29 09:30:22.343168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.773 #27 NEW cov: 11784 ft: 14898 corp: 26/60b lim: 5 exec/s: 27 rss: 70Mb L: 2/4 MS: 1 ShuffleBytes- 00:06:59.773 [2024-11-29 09:30:22.383291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.773 [2024-11-29 09:30:22.383319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.773 [2024-11-29 09:30:22.383437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.773 [2024-11-29 09:30:22.383454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.773 [2024-11-29 09:30:22.383572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.773 [2024-11-29 09:30:22.383591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.773 [2024-11-29 09:30:22.383716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.773 [2024-11-29 09:30:22.383733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:06:59.773 #28 NEW cov: 11784 ft: 14912 corp: 27/64b lim: 5 exec/s: 28 rss: 70Mb L: 4/4 MS: 1 ChangeByte- 00:06:59.773 [2024-11-29 09:30:22.423254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.773 [2024-11-29 09:30:22.423282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.773 [2024-11-29 09:30:22.423412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.773 [2024-11-29 09:30:22.423429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.773 #29 NEW cov: 11784 ft: 14945 corp: 28/66b lim: 5 exec/s: 29 rss: 70Mb L: 2/4 MS: 1 ChangeBinInt- 00:06:59.773 [2024-11-29 09:30:22.463164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.773 [2024-11-29 09:30:22.463192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.773 #30 NEW cov: 11784 ft: 14952 corp: 29/67b lim: 5 exec/s: 30 rss: 70Mb L: 1/4 MS: 1 EraseBytes- 00:06:59.773 [2024-11-29 09:30:22.503499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.773 [2024-11-29 09:30:22.503528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.773 [2024-11-29 09:30:22.503654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.773 [2024-11-29 09:30:22.503672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.773 #31 NEW cov: 11784 ft: 14958 corp: 30/69b lim: 5 exec/s: 31 rss: 70Mb L: 2/4 MS: 1 ShuffleBytes- 00:06:59.773 [2024-11-29 09:30:22.553972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.773 [2024-11-29 09:30:22.554000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.773 [2024-11-29 09:30:22.554123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.773 [2024-11-29 09:30:22.554141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:06:59.773 [2024-11-29 09:30:22.554260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.773 [2024-11-29 09:30:22.554276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:06:59.773 #32 NEW cov: 11784 ft: 14975 corp: 31/72b lim: 5 exec/s: 32 rss: 70Mb L: 3/4 MS: 1 ShuffleBytes- 00:06:59.773 [2024-11-29 09:30:22.593666] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.773 [2024-11-29 09:30:22.593694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:06:59.773 [2024-11-29 09:30:22.593816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:06:59.773 [2024-11-29 09:30:22.593833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.033 #33 NEW cov: 
11784 ft: 15007 corp: 32/74b lim: 5 exec/s: 33 rss: 70Mb L: 2/4 MS: 1 ChangeByte- 00:07:00.033 [2024-11-29 09:30:22.654003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.033 [2024-11-29 09:30:22.654032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.033 [2024-11-29 09:30:22.654156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.033 [2024-11-29 09:30:22.654173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.033 #34 NEW cov: 11784 ft: 15070 corp: 33/76b lim: 5 exec/s: 34 rss: 70Mb L: 2/4 MS: 1 ChangeByte- 00:07:00.033 [2024-11-29 09:30:22.704771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.033 [2024-11-29 09:30:22.704799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.033 [2024-11-29 09:30:22.704929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.033 [2024-11-29 09:30:22.704945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.033 [2024-11-29 09:30:22.705074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.033 [2024-11-29 09:30:22.705091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.033 [2024-11-29 09:30:22.705218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.034 [2024-11-29 09:30:22.705235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.034 #35 NEW cov: 11784 ft: 15109 corp: 34/80b lim: 5 exec/s: 35 rss: 70Mb L: 4/4 MS: 1 ChangeBinInt- 00:07:00.034 [2024-11-29 09:30:22.765138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.034 [2024-11-29 09:30:22.765170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.034 [2024-11-29 09:30:22.765282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.034 [2024-11-29 09:30:22.765299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.034 [2024-11-29 09:30:22.765427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.034 
[2024-11-29 09:30:22.765443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.034 [2024-11-29 09:30:22.765560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.034 [2024-11-29 09:30:22.765576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:00.034 [2024-11-29 09:30:22.765699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.034 [2024-11-29 09:30:22.765717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:00.034 #36 NEW cov: 11784 ft: 15229 corp: 35/85b lim: 5 exec/s: 36 rss: 70Mb L: 5/5 MS: 1 InsertByte- 00:07:00.034 [2024-11-29 09:30:22.814776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.034 [2024-11-29 09:30:22.814804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.034 [2024-11-29 09:30:22.814923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.034 [2024-11-29 09:30:22.814939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.034 [2024-11-29 09:30:22.815055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.034 [2024-11-29 09:30:22.815074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.034 #37 NEW cov: 11784 ft: 15242 corp: 36/88b lim: 5 exec/s: 37 rss: 70Mb L: 3/5 MS: 1 CopyPart- 00:07:00.034 [2024-11-29 09:30:22.854910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.034 [2024-11-29 09:30:22.854937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.034 [2024-11-29 09:30:22.855052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.034 [2024-11-29 09:30:22.855068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.034 [2024-11-29 09:30:22.855184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.034 [2024-11-29 09:30:22.855200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.034 #38 NEW cov: 11784 ft: 15309 corp: 37/91b lim: 5 exec/s: 38 rss: 70Mb L: 3/5 MS: 1 InsertByte- 00:07:00.294 [2024-11-29 09:30:22.895060] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.294 [2024-11-29 09:30:22.895091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.294 [2024-11-29 09:30:22.895211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.294 [2024-11-29 09:30:22.895228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.294 [2024-11-29 09:30:22.895341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.294 [2024-11-29 09:30:22.895359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.294 #39 NEW cov: 11784 ft: 15329 corp: 38/94b lim: 5 exec/s: 39 rss: 70Mb L: 3/5 MS: 1 ChangeBit- 00:07:00.294 [2024-11-29 09:30:22.945071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.294 [2024-11-29 09:30:22.945097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.294 [2024-11-29 09:30:22.945216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.294 [2024-11-29 09:30:22.945232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.294 [2024-11-29 09:30:22.945350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.294 [2024-11-29 09:30:22.945366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:00.294 #40 NEW cov: 11784 ft: 15370 corp: 39/97b lim: 5 exec/s: 40 rss: 70Mb L: 3/5 MS: 1 InsertByte- 00:07:00.294 [2024-11-29 09:30:22.984915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.294 [2024-11-29 09:30:22.984943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.294 [2024-11-29 09:30:22.985059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:00.294 [2024-11-29 09:30:22.985075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:00.294 #41 NEW cov: 11784 ft: 15373 corp: 40/99b lim: 5 exec/s: 20 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:07:00.294 #41 DONE cov: 11784 ft: 15373 corp: 40/99b lim: 5 exec/s: 20 rss: 70Mb 00:07:00.294 Done 41 runs in 2 second(s) 00:07:00.294 09:30:23 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:07:00.294 09:30:23 -- ../common.sh@72 -- # (( i++ )) 
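The "(( i++ ))" trace directly above is test/fuzz/llvm/common.sh advancing from fuzzer 9 to fuzzer 10 after cleaning up /tmp/fuzz_json_9.conf. A minimal sketch of that driver loop follows, assuming only what the trace itself shows (the names i, fuzz_num, and start_llvm_fuzz, and the "10 1 0x1" argument order of fuzzer index, run time in minutes, and core mask); the exact shape of the loop in common.sh is an assumption, not a quote of the script:

    # Sketch of the common.sh driver loop (assumed shape; names and the
    # argument order "index, time, core mask" are taken from the trace above).
    i=0
    while (( i < fuzz_num )); do
        start_llvm_fuzz $i $timen $core    # e.g. start_llvm_fuzz 10 1 0x1
        (( i++ ))                          # matches the "(( i++ ))" trace line
    done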
00:07:00.294 09:30:23 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:00.294 09:30:23 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:07:00.294 09:30:23 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:07:00.294 09:30:23 -- nvmf/run.sh@24 -- # local timen=1 00:07:00.294 09:30:23 -- nvmf/run.sh@25 -- # local core=0x1 00:07:00.294 09:30:23 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:00.294 09:30:23 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:07:00.294 09:30:23 -- nvmf/run.sh@29 -- # printf %02d 10 00:07:00.294 09:30:23 -- nvmf/run.sh@29 -- # port=4410 00:07:00.294 09:30:23 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:00.554 09:30:23 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:07:00.554 09:30:23 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:00.554 09:30:23 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:07:00.554 [2024-11-29 09:30:23.169167] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:00.554 [2024-11-29 09:30:23.169233] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3183287 ] 00:07:00.554 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.554 [2024-11-29 09:30:23.343058] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.813 [2024-11-29 09:30:23.406058] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:00.813 [2024-11-29 09:30:23.406204] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.813 [2024-11-29 09:30:23.464075] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:00.813 [2024-11-29 09:30:23.480427] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:00.813 INFO: Running with entropic power schedule (0xFF, 100). 00:07:00.813 INFO: Seed: 466709442 00:07:00.813 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:00.813 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:00.813 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:00.813 INFO: A corpus is not provided, starting from an empty corpus 00:07:00.813 #2 INITED exec/s: 0 rss: 60Mb 00:07:00.813 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
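The nvmf/run.sh trace above shows how each fuzzer instance gets an isolated NVMe/TCP listener: the trsvcid is derived from the fuzzer index (printf %02d 10 giving port 4410), the stock fuzz_json.conf has its "trsvcid": "4420" rewritten by sed, and llvm_nvme_fuzz is launched against the resulting TRID. A minimal sketch of that wiring, using the variable names and flags visible in the trace; the 44-prefix port arithmetic and the redirection of the sed output into the per-fuzzer config file are inferred, not quoted:

    # Sketch of the per-fuzzer setup from nvmf/run.sh (assumed wiring; every
    # flag below appears in the traced llvm_nvme_fuzz invocation above).
    port=44$(printf %02d $fuzzer_type)    # fuzzer 10 -> trsvcid 4410
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    mkdir -p $corpus_dir
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        $rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf > $nvmf_cfg
    $rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m $core -s 512 \
        -P $output_dir/llvm/ -F "$trid" -c $nvmf_cfg -t $timen \
        -D $corpus_dir -Z $fuzzer_type -r /var/tmp/spdk$fuzzer_type.sock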
00:07:00.813 This may also happen if the target rejected all inputs we tried so far 00:07:00.813 [2024-11-29 09:30:23.550045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.813 [2024-11-29 09:30:23.550078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:00.813 [2024-11-29 09:30:23.550219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:00.813 [2024-11-29 09:30:23.550235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.072 NEW_FUNC[1/669]: 0x447688 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:01.072 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:01.072 #16 NEW cov: 11578 ft: 11578 corp: 2/23b lim: 40 exec/s: 0 rss: 68Mb L: 22/22 MS: 4 InsertByte-EraseBytes-ChangeByte-InsertRepeatedBytes- 00:07:01.072 [2024-11-29 09:30:23.871067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.072 [2024-11-29 09:30:23.871105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.072 [2024-11-29 09:30:23.871236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.072 [2024-11-29 09:30:23.871255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.072 [2024-11-29 09:30:23.871357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00008803 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.072 [2024-11-29 09:30:23.871377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.072 NEW_FUNC[1/1]: 0x1789ac8 in nvme_tcp_read_data /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk_internal/nvme_tcp.h:412 00:07:01.072 #19 NEW cov: 11693 ft: 12283 corp: 3/47b lim: 40 exec/s: 0 rss: 68Mb L: 24/24 MS: 3 ChangeByte-InsertByte-CrossOver- 00:07:01.072 [2024-11-29 09:30:23.910237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.072 [2024-11-29 09:30:23.910267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.332 #20 NEW cov: 11699 ft: 13142 corp: 4/60b lim: 40 exec/s: 0 rss: 68Mb L: 13/24 MS: 1 EraseBytes- 00:07:01.332 [2024-11-29 09:30:23.951307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:23.951338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.332 [2024-11-29 09:30:23.951475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:23.951491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.332 [2024-11-29 09:30:23.951613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00880300 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:23.951629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.332 #21 NEW cov: 11784 ft: 13466 corp: 5/90b lim: 40 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 CrossOver- 00:07:01.332 [2024-11-29 09:30:24.001140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:24.001168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.332 [2024-11-29 09:30:24.001290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:24.001306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.332 #22 NEW cov: 11784 ft: 13524 corp: 6/112b lim: 40 exec/s: 0 rss: 68Mb L: 22/30 MS: 1 CopyPart- 00:07:01.332 [2024-11-29 09:30:24.041448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:24.041477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.332 [2024-11-29 09:30:24.041604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:24.041619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.332 [2024-11-29 09:30:24.041740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00008803 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:24.041756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.332 #23 NEW cov: 11784 ft: 13634 corp: 7/136b lim: 40 exec/s: 0 rss: 68Mb L: 24/30 MS: 1 ShuffleBytes- 00:07:01.332 [2024-11-29 09:30:24.081605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:24.081635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.332 [2024-11-29 09:30:24.081758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:24.081774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.332 [2024-11-29 09:30:24.081911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00008800 cdw11:00000088 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:24.081927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.332 #24 NEW cov: 11784 ft: 13751 corp: 8/161b lim: 40 exec/s: 0 rss: 68Mb L: 25/30 MS: 1 CopyPart- 00:07:01.332 [2024-11-29 09:30:24.121493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3138e9e9 cdw11:e9e9e9e9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:24.121521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.332 [2024-11-29 09:30:24.121659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e9e9e9e9 cdw11:e9e9e9e9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:24.121675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.332 #29 NEW cov: 11784 ft: 13780 corp: 9/183b lim: 40 exec/s: 0 rss: 68Mb L: 22/30 MS: 5 ShuffleBytes-InsertByte-CopyPart-ChangeBinInt-InsertRepeatedBytes- 00:07:01.332 [2024-11-29 09:30:24.162062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:24.162090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.332 [2024-11-29 09:30:24.162224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:24.162241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.332 [2024-11-29 09:30:24.162369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:24.162386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.332 [2024-11-29 09:30:24.162521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00880300 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.332 [2024-11-29 09:30:24.162540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:01.593 #30 NEW cov: 11784 ft: 14282 corp: 10/221b lim: 40 exec/s: 0 rss: 68Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:07:01.593 [2024-11-29 09:30:24.212012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.212040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.593 [2024-11-29 09:30:24.212174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.212190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.593 [2024-11-29 09:30:24.212319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00808800 cdw11:00000088 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.212337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.593 #31 NEW cov: 11784 ft: 14378 corp: 11/246b lim: 40 exec/s: 0 rss: 68Mb L: 25/38 MS: 1 ChangeBit- 00:07:01.593 [2024-11-29 09:30:24.262405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.262432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.593 [2024-11-29 09:30:24.262573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.262592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.593 [2024-11-29 09:30:24.262725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.262743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.593 [2024-11-29 09:30:24.262864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00880300 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.262882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:01.593 #32 NEW cov: 11784 ft: 14400 corp: 12/284b lim: 40 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 ChangeBinInt- 00:07:01.593 [2024-11-29 09:30:24.312492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.312521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.593 [2024-11-29 09:30:24.312638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.312657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.593 [2024-11-29 09:30:24.312790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.312807] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.593 [2024-11-29 09:30:24.312932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.312948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:01.593 #33 NEW cov: 11784 ft: 14431 corp: 13/323b lim: 40 exec/s: 0 rss: 69Mb L: 39/39 MS: 1 CrossOver- 00:07:01.593 [2024-11-29 09:30:24.362118] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.362146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.593 [2024-11-29 09:30:24.362281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.362300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.593 [2024-11-29 09:30:24.362439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00882700 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.362456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.593 #39 NEW cov: 11784 ft: 14463 corp: 14/353b lim: 40 exec/s: 0 rss: 69Mb L: 30/39 MS: 1 ChangeByte- 00:07:01.593 [2024-11-29 09:30:24.402570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00003a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.402602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.593 [2024-11-29 09:30:24.402737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.402753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.593 [2024-11-29 09:30:24.402883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00008827 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.593 [2024-11-29 09:30:24.402900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.593 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:01.593 #40 NEW cov: 11807 ft: 14517 corp: 15/384b lim: 40 exec/s: 0 rss: 69Mb L: 31/39 MS: 1 InsertByte- 00:07:01.853 [2024-11-29 09:30:24.453004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.853 [2024-11-29 09:30:24.453032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:07:01.853 [2024-11-29 09:30:24.453165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ff004000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.853 [2024-11-29 09:30:24.453180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.853 [2024-11-29 09:30:24.453315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.853 [2024-11-29 09:30:24.453330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.853 [2024-11-29 09:30:24.453460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00880300 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.853 [2024-11-29 09:30:24.453475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:01.853 #41 NEW cov: 11807 ft: 14526 corp: 16/422b lim: 40 exec/s: 0 rss: 69Mb L: 38/39 MS: 1 ChangeBit- 00:07:01.853 [2024-11-29 09:30:24.492326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.853 [2024-11-29 09:30:24.492354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.853 #42 NEW cov: 11807 ft: 14557 corp: 17/435b lim: 40 exec/s: 0 rss: 69Mb L: 13/39 MS: 1 ChangeBinInt- 00:07:01.853 [2024-11-29 09:30:24.533013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.853 [2024-11-29 09:30:24.533040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.853 [2024-11-29 09:30:24.533172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.853 [2024-11-29 09:30:24.533190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.853 [2024-11-29 09:30:24.533324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00882700 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.853 [2024-11-29 09:30:24.533340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.853 #43 NEW cov: 11807 ft: 14573 corp: 18/464b lim: 40 exec/s: 43 rss: 69Mb L: 29/39 MS: 1 EraseBytes- 00:07:01.853 [2024-11-29 09:30:24.572680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00003a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.853 [2024-11-29 09:30:24.572708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.853 #44 NEW cov: 11807 ft: 14588 corp: 19/472b lim: 40 exec/s: 44 rss: 69Mb L: 8/39 MS: 1 CrossOver- 00:07:01.853 [2024-11-29 09:30:24.613206] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.853 [2024-11-29 09:30:24.613235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.853 [2024-11-29 09:30:24.613370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.853 [2024-11-29 09:30:24.613386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.853 [2024-11-29 09:30:24.613509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00882700 cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.854 [2024-11-29 09:30:24.613524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.854 #45 NEW cov: 11807 ft: 14596 corp: 20/502b lim: 40 exec/s: 45 rss: 69Mb L: 30/39 MS: 1 CMP- DE: "\377\377\377\377\001 \307\336"- 00:07:01.854 [2024-11-29 09:30:24.653111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.854 [2024-11-29 09:30:24.653137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.854 [2024-11-29 09:30:24.653278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.854 [2024-11-29 09:30:24.653295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.854 #46 NEW cov: 11807 ft: 14611 corp: 21/524b lim: 40 exec/s: 46 rss: 69Mb L: 22/39 MS: 1 InsertRepeatedBytes- 00:07:01.854 [2024-11-29 09:30:24.693784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.854 [2024-11-29 09:30:24.693812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:01.854 [2024-11-29 09:30:24.693937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.854 [2024-11-29 09:30:24.693956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:01.854 [2024-11-29 09:30:24.694084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:9e9e9e00 cdw11:88270000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.854 [2024-11-29 09:30:24.694102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:01.854 [2024-11-29 09:30:24.694226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:01.854 [2024-11-29 09:30:24.694242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.113 #47 NEW cov: 11807 ft: 14617 corp: 22/556b lim: 40 exec/s: 47 rss: 69Mb L: 32/39 MS: 1 InsertRepeatedBytes- 00:07:02.113 [2024-11-29 09:30:24.733628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4d000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.113 [2024-11-29 09:30:24.733655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.113 [2024-11-29 09:30:24.733793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.113 [2024-11-29 09:30:24.733808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.113 [2024-11-29 09:30:24.733942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000088 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.113 [2024-11-29 09:30:24.733957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.113 #48 NEW cov: 11807 ft: 14622 corp: 23/581b lim: 40 exec/s: 48 rss: 69Mb L: 25/39 MS: 1 InsertByte- 00:07:02.113 [2024-11-29 09:30:24.773197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000088 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.113 [2024-11-29 09:30:24.773223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.113 #49 NEW cov: 11807 ft: 14682 corp: 24/590b lim: 40 exec/s: 49 rss: 69Mb L: 9/39 MS: 1 EraseBytes- 00:07:02.113 [2024-11-29 09:30:24.814060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.113 [2024-11-29 09:30:24.814087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.113 [2024-11-29 09:30:24.814215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:ffff0120 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.113 [2024-11-29 09:30:24.814231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.113 [2024-11-29 09:30:24.814353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:c7de9e00 cdw11:88270000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.113 [2024-11-29 09:30:24.814368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.113 [2024-11-29 09:30:24.814505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.113 [2024-11-29 09:30:24.814520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.113 #50 NEW cov: 11807 ft: 14686 corp: 25/622b lim: 40 exec/s: 50 rss: 69Mb L: 32/39 MS: 1 PersAutoDict- DE: "\377\377\377\377\001 \307\336"- 00:07:02.113 
[2024-11-29 09:30:24.854219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.113 [2024-11-29 09:30:24.854247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.113 [2024-11-29 09:30:24.854389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.113 [2024-11-29 09:30:24.854405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.113 [2024-11-29 09:30:24.854507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:88270000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.113 [2024-11-29 09:30:24.854523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.113 [2024-11-29 09:30:24.854660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.113 [2024-11-29 09:30:24.854677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.113 #51 NEW cov: 11807 ft: 14692 corp: 26/655b lim: 40 exec/s: 51 rss: 69Mb L: 33/39 MS: 1 InsertRepeatedBytes- 00:07:02.113 [2024-11-29 09:30:24.893822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3138e9e9 cdw11:e9e9e9e9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.113 [2024-11-29 09:30:24.893849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.113 [2024-11-29 09:30:24.893978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e9e9e9e9 cdw11:e9e9e9e9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.113 [2024-11-29 09:30:24.893994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.113 #52 NEW cov: 11807 ft: 14730 corp: 27/673b lim: 40 exec/s: 52 rss: 70Mb L: 18/39 MS: 1 EraseBytes- 00:07:02.113 [2024-11-29 09:30:24.933839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.113 [2024-11-29 09:30:24.933868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.373 #53 NEW cov: 11807 ft: 14739 corp: 28/686b lim: 40 exec/s: 53 rss: 70Mb L: 13/39 MS: 1 EraseBytes- 00:07:02.373 [2024-11-29 09:30:24.974368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:4d000000 cdw11:00000004 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:24.974395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.373 [2024-11-29 09:30:24.974526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:02.373 [2024-11-29 09:30:24.974541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.373 [2024-11-29 09:30:24.974673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000088 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:24.974689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.373 #54 NEW cov: 11807 ft: 14740 corp: 29/711b lim: 40 exec/s: 54 rss: 70Mb L: 25/39 MS: 1 ChangeBinInt- 00:07:02.373 [2024-11-29 09:30:25.014389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:25.014416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.373 [2024-11-29 09:30:25.014546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:25.014564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.373 [2024-11-29 09:30:25.014696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:0120c7de SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:25.014713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.373 [2024-11-29 09:30:25.054755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:25.054782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.373 [2024-11-29 09:30:25.054922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:25.054938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.373 [2024-11-29 09:30:25.055065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:25.055082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.373 [2024-11-29 09:30:25.055210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:000000ff cdw11:ffffff01 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:25.055226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.373 #56 NEW cov: 11807 ft: 14744 corp: 30/746b lim: 40 exec/s: 56 rss: 70Mb L: 35/39 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:02.373 [2024-11-29 09:30:25.094675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 
nsid:0 cdw10:4d000000 cdw11:00000004 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:25.094702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.373 [2024-11-29 09:30:25.094828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:003d4e37 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:25.094844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.373 [2024-11-29 09:30:25.094985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ec000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:25.095002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.373 #57 NEW cov: 11807 ft: 14771 corp: 31/775b lim: 40 exec/s: 57 rss: 70Mb L: 29/39 MS: 1 CMP- DE: "=N7\354"- 00:07:02.373 [2024-11-29 09:30:25.145109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00003d4e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:25.145138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.373 [2024-11-29 09:30:25.145273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:37ec0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:25.145290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.373 [2024-11-29 09:30:25.145417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00882700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:25.145437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.373 [2024-11-29 09:30:25.145564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:25.145580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.373 #58 NEW cov: 11807 ft: 14794 corp: 32/808b lim: 40 exec/s: 58 rss: 70Mb L: 33/39 MS: 1 PersAutoDict- DE: "=N7\354"- 00:07:02.373 [2024-11-29 09:30:25.184540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00008803 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.373 [2024-11-29 09:30:25.184567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.373 #59 NEW cov: 11807 ft: 14801 corp: 33/816b lim: 40 exec/s: 59 rss: 70Mb L: 8/39 MS: 1 EraseBytes- 00:07:02.632 [2024-11-29 09:30:25.224796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.224823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.632 [2024-11-29 09:30:25.224954] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:3d000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.224971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.632 [2024-11-29 09:30:25.225108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00008800 cdw11:00000088 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.225124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.632 #60 NEW cov: 11807 ft: 14826 corp: 34/841b lim: 40 exec/s: 60 rss: 70Mb L: 25/39 MS: 1 ChangeByte- 00:07:02.632 [2024-11-29 09:30:25.265391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.265418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.632 [2024-11-29 09:30:25.265556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.265573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.632 [2024-11-29 09:30:25.265712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:88270000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.265729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.632 [2024-11-29 09:30:25.265857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.265872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.632 #61 NEW cov: 11807 ft: 14850 corp: 35/874b lim: 40 exec/s: 61 rss: 70Mb L: 33/39 MS: 1 ShuffleBytes- 00:07:02.632 [2024-11-29 09:30:25.315513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.315540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.632 [2024-11-29 09:30:25.315691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.315708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.632 [2024-11-29 09:30:25.315835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00008800 cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.315852] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.632 [2024-11-29 09:30:25.315970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.315986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.632 #62 NEW cov: 11807 ft: 14855 corp: 36/911b lim: 40 exec/s: 62 rss: 70Mb L: 37/39 MS: 1 InsertRepeatedBytes- 00:07:02.632 [2024-11-29 09:30:25.355295] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.355323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.632 [2024-11-29 09:30:25.355471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.355487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.632 #63 NEW cov: 11807 ft: 14870 corp: 37/934b lim: 40 exec/s: 63 rss: 70Mb L: 23/39 MS: 1 EraseBytes- 00:07:02.632 [2024-11-29 09:30:25.395557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:b7ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.395587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.632 [2024-11-29 09:30:25.395723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.395741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.632 [2024-11-29 09:30:25.395866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000088 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.395883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.632 #64 NEW cov: 11807 ft: 14876 corp: 38/959b lim: 40 exec/s: 64 rss: 70Mb L: 25/39 MS: 1 ChangeBinInt- 00:07:02.632 [2024-11-29 09:30:25.435486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:3138e9e9 cdw11:e9e9e9e9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.632 [2024-11-29 09:30:25.435515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.632 [2024-11-29 09:30:25.435647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:e9e9e9e9 cdw11:e9e9e9e9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.633 [2024-11-29 09:30:25.435663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.633 #65 NEW cov: 11807 ft: 14911 corp: 39/977b lim: 40 exec/s: 65 
rss: 70Mb L: 18/39 MS: 1 ShuffleBytes- 00:07:02.893 [2024-11-29 09:30:25.485851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00003a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.893 [2024-11-29 09:30:25.485884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.893 [2024-11-29 09:30:25.486017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.893 [2024-11-29 09:30:25.486034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.893 [2024-11-29 09:30:25.486168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00008827 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.893 [2024-11-29 09:30:25.486185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.893 #66 NEW cov: 11807 ft: 14950 corp: 40/1001b lim: 40 exec/s: 66 rss: 70Mb L: 24/39 MS: 1 EraseBytes- 00:07:02.893 [2024-11-29 09:30:25.536091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00ffffff cdw11:ff0120c7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.893 [2024-11-29 09:30:25.536119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:02.893 [2024-11-29 09:30:25.536258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:de000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.893 [2024-11-29 09:30:25.536275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:02.893 [2024-11-29 09:30:25.536401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.893 [2024-11-29 09:30:25.536418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:02.893 [2024-11-29 09:30:25.536543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00882700 cdw11:0000ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:02.893 [2024-11-29 09:30:25.536559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:02.893 #67 NEW cov: 11807 ft: 14971 corp: 41/1039b lim: 40 exec/s: 33 rss: 70Mb L: 38/39 MS: 1 PersAutoDict- DE: "\377\377\377\377\001 \307\336"- 00:07:02.893 #67 DONE cov: 11807 ft: 14971 corp: 41/1039b lim: 40 exec/s: 33 rss: 70Mb 00:07:02.893 ###### Recommended dictionary. ###### 00:07:02.893 "\377\377\377\377\001 \307\336" # Uses: 2 00:07:02.893 "=N7\354" # Uses: 1 00:07:02.893 ###### End of recommended dictionary. 
###### 00:07:02.893 Done 67 runs in 2 second(s) 00:07:02.893 09:30:25 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:07:02.893 09:30:25 -- ../common.sh@72 -- # (( i++ )) 00:07:02.893 09:30:25 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:02.893 09:30:25 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:07:02.893 09:30:25 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:07:02.893 09:30:25 -- nvmf/run.sh@24 -- # local timen=1 00:07:02.893 09:30:25 -- nvmf/run.sh@25 -- # local core=0x1 00:07:02.893 09:30:25 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:02.893 09:30:25 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:07:02.893 09:30:25 -- nvmf/run.sh@29 -- # printf %02d 11 00:07:02.893 09:30:25 -- nvmf/run.sh@29 -- # port=4411 00:07:02.893 09:30:25 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:02.893 09:30:25 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:07:02.893 09:30:25 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:02.893 09:30:25 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:07:02.893 [2024-11-29 09:30:25.724721] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:02.893 [2024-11-29 09:30:25.724790] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3183722 ] 00:07:03.153 EAL: No free 2048 kB hugepages reported on node 1 00:07:03.153 [2024-11-29 09:30:25.906255] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:03.153 [2024-11-29 09:30:25.969994] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:03.153 [2024-11-29 09:30:25.970125] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:03.413 [2024-11-29 09:30:26.028357] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:03.413 [2024-11-29 09:30:26.044720] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:07:03.413 INFO: Running with entropic power schedule (0xFF, 100). 00:07:03.413 INFO: Seed: 3032707371 00:07:03.413 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:03.413 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:03.413 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:03.413 INFO: A corpus is not provided, starting from an empty corpus 00:07:03.413 #2 INITED exec/s: 0 rss: 60Mb 00:07:03.413 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:03.413 This may also happen if the target rejected all inputs we tried so far 00:07:03.413 [2024-11-29 09:30:26.089435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.413 [2024-11-29 09:30:26.089471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.413 [2024-11-29 09:30:26.089520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.413 [2024-11-29 09:30:26.089536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.413 [2024-11-29 09:30:26.089567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.413 [2024-11-29 09:30:26.089582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.672 NEW_FUNC[1/671]: 0x4493f8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:07:03.672 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:03.672 #15 NEW cov: 11592 ft: 11593 corp: 2/28b lim: 40 exec/s: 0 rss: 68Mb L: 27/27 MS: 3 CrossOver-ChangeBinInt-InsertRepeatedBytes- 00:07:03.672 [2024-11-29 09:30:26.410156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.672 [2024-11-29 09:30:26.410194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.672 [2024-11-29 09:30:26.410228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.672 [2024-11-29 09:30:26.410244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.672 #17 NEW cov: 11705 ft: 12158 corp: 3/49b lim: 40 exec/s: 0 rss: 68Mb L: 21/27 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:03.672 [2024-11-29 09:30:26.460127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.672 [2024-11-29 09:30:26.460158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.672 [2024-11-29 09:30:26.460205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.672 [2024-11-29 09:30:26.460221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.672 #18 NEW cov: 11711 ft: 12372 corp: 4/71b lim: 40 exec/s: 0 rss: 68Mb L: 22/27 MS: 1 InsertByte- 00:07:03.933 [2024-11-29 09:30:26.520259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff 
cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.933 [2024-11-29 09:30:26.520288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.933 #19 NEW cov: 11796 ft: 13485 corp: 5/82b lim: 40 exec/s: 0 rss: 68Mb L: 11/27 MS: 1 EraseBytes- 00:07:03.933 [2024-11-29 09:30:26.580419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.933 [2024-11-29 09:30:26.580451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.933 [2024-11-29 09:30:26.580498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.933 [2024-11-29 09:30:26.580514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.933 #20 NEW cov: 11796 ft: 13702 corp: 6/104b lim: 40 exec/s: 0 rss: 68Mb L: 22/27 MS: 1 CrossOver- 00:07:03.933 [2024-11-29 09:30:26.640684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.933 [2024-11-29 09:30:26.640715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.933 [2024-11-29 09:30:26.640748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.933 [2024-11-29 09:30:26.640764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.933 [2024-11-29 09:30:26.640794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.933 [2024-11-29 09:30:26.640809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.933 #22 NEW cov: 11796 ft: 13759 corp: 7/131b lim: 40 exec/s: 0 rss: 68Mb L: 27/27 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:03.933 [2024-11-29 09:30:26.690799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.933 [2024-11-29 09:30:26.690838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.933 [2024-11-29 09:30:26.690886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.933 [2024-11-29 09:30:26.690901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.933 [2024-11-29 09:30:26.690931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:2f2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.933 [2024-11-29 09:30:26.690946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:03.933 #23 NEW 
cov: 11796 ft: 13844 corp: 8/158b lim: 40 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 ChangeByte- 00:07:03.933 [2024-11-29 09:30:26.761012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.933 [2024-11-29 09:30:26.761041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:03.933 [2024-11-29 09:30:26.761089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.933 [2024-11-29 09:30:26.761104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:03.933 [2024-11-29 09:30:26.761134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:03.933 [2024-11-29 09:30:26.761149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.193 #24 NEW cov: 11796 ft: 13898 corp: 9/185b lim: 40 exec/s: 0 rss: 68Mb L: 27/27 MS: 1 ChangeBinInt- 00:07:04.193 [2024-11-29 09:30:26.831367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.193 [2024-11-29 09:30:26.831397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.193 [2024-11-29 09:30:26.831430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0a0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.193 [2024-11-29 09:30:26.831446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.193 [2024-11-29 09:30:26.831475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.193 [2024-11-29 09:30:26.831490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.193 [2024-11-29 09:30:26.831519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.193 [2024-11-29 09:30:26.831534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:04.193 [2024-11-29 09:30:26.831562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.193 [2024-11-29 09:30:26.831577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:04.193 #25 NEW cov: 11796 ft: 14309 corp: 10/225b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 CrossOver- 00:07:04.193 [2024-11-29 09:30:26.901395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.193 [2024-11-29 09:30:26.901426] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.193 [2024-11-29 09:30:26.901459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffff0b cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.193 [2024-11-29 09:30:26.901475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.193 [2024-11-29 09:30:26.901504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff29 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.193 [2024-11-29 09:30:26.901520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.193 #26 NEW cov: 11796 ft: 14417 corp: 11/249b lim: 40 exec/s: 0 rss: 68Mb L: 24/40 MS: 1 CMP- DE: "\377\013"- 00:07:04.193 [2024-11-29 09:30:26.951476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.193 [2024-11-29 09:30:26.951506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.193 [2024-11-29 09:30:26.951540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.193 [2024-11-29 09:30:26.951555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.193 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:04.193 #27 NEW cov: 11813 ft: 14464 corp: 12/271b lim: 40 exec/s: 0 rss: 69Mb L: 22/40 MS: 1 ShuffleBytes- 00:07:04.193 [2024-11-29 09:30:27.001671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.193 [2024-11-29 09:30:27.001703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.193 [2024-11-29 09:30:27.001737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:18ffff0b cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.193 [2024-11-29 09:30:27.001753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.193 [2024-11-29 09:30:27.001784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff29 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.193 [2024-11-29 09:30:27.001799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.453 #28 NEW cov: 11813 ft: 14483 corp: 13/295b lim: 40 exec/s: 0 rss: 69Mb L: 24/40 MS: 1 ChangeBinInt- 00:07:04.453 [2024-11-29 09:30:27.071849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.453 [2024-11-29 09:30:27.071880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:04.453 [2024-11-29 09:30:27.071913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.453 [2024-11-29 09:30:27.071928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.453 [2024-11-29 09:30:27.071958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.453 [2024-11-29 09:30:27.071973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.453 #29 NEW cov: 11813 ft: 14505 corp: 14/322b lim: 40 exec/s: 29 rss: 69Mb L: 27/40 MS: 1 CrossOver- 00:07:04.453 [2024-11-29 09:30:27.121946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.453 [2024-11-29 09:30:27.121977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.453 [2024-11-29 09:30:27.122027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.453 [2024-11-29 09:30:27.122043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.453 [2024-11-29 09:30:27.122079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:2f2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.453 [2024-11-29 09:30:27.122095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.453 #30 NEW cov: 11813 ft: 14606 corp: 15/349b lim: 40 exec/s: 30 rss: 69Mb L: 27/40 MS: 1 ShuffleBytes- 00:07:04.453 [2024-11-29 09:30:27.182140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.453 [2024-11-29 09:30:27.182171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.453 [2024-11-29 09:30:27.182219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2f2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.454 [2024-11-29 09:30:27.182235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.454 [2024-11-29 09:30:27.182265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:2f2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.454 [2024-11-29 09:30:27.182281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.454 #31 NEW cov: 11813 ft: 14695 corp: 16/376b lim: 40 exec/s: 31 rss: 69Mb L: 27/40 MS: 1 CopyPart- 00:07:04.454 [2024-11-29 09:30:27.232269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.454 [2024-11-29 09:30:27.232300] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.454 [2024-11-29 09:30:27.232334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.454 [2024-11-29 09:30:27.232349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.454 [2024-11-29 09:30:27.232379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:2f2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.454 [2024-11-29 09:30:27.232394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.454 #32 NEW cov: 11813 ft: 14733 corp: 17/404b lim: 40 exec/s: 32 rss: 69Mb L: 28/40 MS: 1 InsertByte- 00:07:04.454 [2024-11-29 09:30:27.282307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.454 [2024-11-29 09:30:27.282338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.454 [2024-11-29 09:30:27.282371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.454 [2024-11-29 09:30:27.282386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.713 #38 NEW cov: 11813 ft: 14778 corp: 18/421b lim: 40 exec/s: 38 rss: 69Mb L: 17/40 MS: 1 EraseBytes- 00:07:04.713 [2024-11-29 09:30:27.332493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.713 [2024-11-29 09:30:27.332523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.713 [2024-11-29 09:30:27.332557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffff0b cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.713 [2024-11-29 09:30:27.332573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.714 [2024-11-29 09:30:27.332618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.714 [2024-11-29 09:30:27.332634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.714 #39 NEW cov: 11813 ft: 14848 corp: 19/445b lim: 40 exec/s: 39 rss: 69Mb L: 24/40 MS: 1 CopyPart- 00:07:04.714 [2024-11-29 09:30:27.382646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.714 [2024-11-29 09:30:27.382676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.714 [2024-11-29 09:30:27.382710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 
cid:5 nsid:0 cdw10:2e2e0e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.714 [2024-11-29 09:30:27.382725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.714 [2024-11-29 09:30:27.382755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:2f2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.714 [2024-11-29 09:30:27.382770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.714 #40 NEW cov: 11813 ft: 14952 corp: 20/472b lim: 40 exec/s: 40 rss: 69Mb L: 27/40 MS: 1 ChangeBit- 00:07:04.714 [2024-11-29 09:30:27.442770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e36 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.714 [2024-11-29 09:30:27.442800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.714 [2024-11-29 09:30:27.442833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.714 [2024-11-29 09:30:27.442849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.714 #41 NEW cov: 11813 ft: 15046 corp: 21/490b lim: 40 exec/s: 41 rss: 69Mb L: 18/40 MS: 1 InsertByte- 00:07:04.714 [2024-11-29 09:30:27.512991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.714 [2024-11-29 09:30:27.513021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.714 [2024-11-29 09:30:27.513069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:18ffff0b cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.714 [2024-11-29 09:30:27.513084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.714 [2024-11-29 09:30:27.513115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffaeff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.714 [2024-11-29 09:30:27.513130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.973 #42 NEW cov: 11813 ft: 15063 corp: 22/515b lim: 40 exec/s: 42 rss: 69Mb L: 25/40 MS: 1 InsertByte- 00:07:04.973 [2024-11-29 09:30:27.583053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.973 [2024-11-29 09:30:27.583081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.974 #43 NEW cov: 11813 ft: 15094 corp: 23/524b lim: 40 exec/s: 43 rss: 69Mb L: 9/40 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\002"- 00:07:04.974 [2024-11-29 09:30:27.643293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.974 [2024-11-29 
09:30:27.643323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.974 [2024-11-29 09:30:27.643371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.974 [2024-11-29 09:30:27.643386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.974 [2024-11-29 09:30:27.643416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.974 [2024-11-29 09:30:27.643431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.974 #44 NEW cov: 11813 ft: 15097 corp: 24/553b lim: 40 exec/s: 44 rss: 69Mb L: 29/40 MS: 1 CrossOver- 00:07:04.974 [2024-11-29 09:30:27.703502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.974 [2024-11-29 09:30:27.703531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.974 [2024-11-29 09:30:27.703579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:18ffff0b cdw11:ffff7fff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.974 [2024-11-29 09:30:27.703595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.974 [2024-11-29 09:30:27.703631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff29 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.974 [2024-11-29 09:30:27.703647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.974 #45 NEW cov: 11813 ft: 15119 corp: 25/577b lim: 40 exec/s: 45 rss: 69Mb L: 24/40 MS: 1 ChangeBit- 00:07:04.974 [2024-11-29 09:30:27.754930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.974 [2024-11-29 09:30:27.755005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.974 [2024-11-29 09:30:27.755138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffff0b cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.974 [2024-11-29 09:30:27.755178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.974 [2024-11-29 09:30:27.755299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:0100ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.974 [2024-11-29 09:30:27.755337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:04.974 #46 NEW cov: 11813 ft: 15210 corp: 26/603b lim: 40 exec/s: 46 rss: 69Mb L: 26/40 MS: 1 CMP- DE: "\001\000"- 00:07:04.974 [2024-11-29 09:30:27.814665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:282e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.974 [2024-11-29 09:30:27.814691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:04.974 [2024-11-29 09:30:27.814751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2e2e0e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.974 [2024-11-29 09:30:27.814765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:04.974 [2024-11-29 09:30:27.814826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:2f2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:04.974 [2024-11-29 09:30:27.814840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.233 #47 NEW cov: 11813 ft: 15266 corp: 27/630b lim: 40 exec/s: 47 rss: 69Mb L: 27/40 MS: 1 ChangeBinInt- 00:07:05.233 [2024-11-29 09:30:27.854902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:27.854927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.234 [2024-11-29 09:30:27.855003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:27.855016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.234 [2024-11-29 09:30:27.855077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:02000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:27.855090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.234 [2024-11-29 09:30:27.855149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:27.855162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:05.234 #48 NEW cov: 11813 ft: 15323 corp: 28/665b lim: 40 exec/s: 48 rss: 69Mb L: 35/40 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\002"- 00:07:05.234 [2024-11-29 09:30:27.894873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:27.894898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.234 [2024-11-29 09:30:27.894977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:27.894991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.234 [2024-11-29 09:30:27.895051] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:27.895064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.234 #49 NEW cov: 11813 ft: 15339 corp: 29/692b lim: 40 exec/s: 49 rss: 69Mb L: 27/40 MS: 1 ChangeByte- 00:07:05.234 [2024-11-29 09:30:27.935047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:27.935072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.234 [2024-11-29 09:30:27.935133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffff0b cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:27.935146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.234 [2024-11-29 09:30:27.935208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:dfffff29 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:27.935221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.234 #50 NEW cov: 11813 ft: 15368 corp: 30/716b lim: 40 exec/s: 50 rss: 69Mb L: 24/40 MS: 1 ChangeBit- 00:07:05.234 [2024-11-29 09:30:27.975117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:27.975142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.234 [2024-11-29 09:30:27.975219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:27.975233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.234 [2024-11-29 09:30:27.975293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:2f2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:27.975306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.234 #51 NEW cov: 11820 ft: 15384 corp: 31/744b lim: 40 exec/s: 51 rss: 69Mb L: 28/40 MS: 1 InsertByte- 00:07:05.234 [2024-11-29 09:30:28.015224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:28.015250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.234 [2024-11-29 09:30:28.015325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:ffffff0b cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:28.015339] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.234 [2024-11-29 09:30:28.015400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:28.015413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.234 #52 NEW cov: 11820 ft: 15393 corp: 32/768b lim: 40 exec/s: 52 rss: 69Mb L: 24/40 MS: 1 ShuffleBytes- 00:07:05.234 [2024-11-29 09:30:28.055322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:28.055348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.234 [2024-11-29 09:30:28.055400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:28.055413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.234 [2024-11-29 09:30:28.055472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:2f2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.234 [2024-11-29 09:30:28.055485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.234 #53 NEW cov: 11820 ft: 15398 corp: 33/795b lim: 40 exec/s: 53 rss: 69Mb L: 27/40 MS: 1 ShuffleBytes- 00:07:05.494 [2024-11-29 09:30:28.095480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a0affff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.494 [2024-11-29 09:30:28.095505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:05.494 [2024-11-29 09:30:28.095567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:18ffff0b cdw11:ffff7fff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.494 [2024-11-29 09:30:28.095584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:05.494 [2024-11-29 09:30:28.095648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:fffffff6 cdw11:ffffff29 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:05.494 [2024-11-29 09:30:28.095661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:05.494 #54 NEW cov: 11820 ft: 15412 corp: 34/819b lim: 40 exec/s: 27 rss: 69Mb L: 24/40 MS: 1 ChangeBinInt- 00:07:05.494 #54 DONE cov: 11820 ft: 15412 corp: 34/819b lim: 40 exec/s: 27 rss: 69Mb 00:07:05.494 ###### Recommended dictionary. ###### 00:07:05.494 "\377\013" # Uses: 1 00:07:05.494 "\000\000\000\000\000\000\000\002" # Uses: 1 00:07:05.494 "\001\000" # Uses: 0 00:07:05.494 ###### End of recommended dictionary. 
###### 00:07:05.494 Done 54 runs in 2 second(s) 00:07:05.494 09:30:28 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:07:05.494 09:30:28 -- ../common.sh@72 -- # (( i++ )) 00:07:05.494 09:30:28 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:05.494 09:30:28 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:07:05.494 09:30:28 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:07:05.494 09:30:28 -- nvmf/run.sh@24 -- # local timen=1 00:07:05.494 09:30:28 -- nvmf/run.sh@25 -- # local core=0x1 00:07:05.494 09:30:28 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:05.494 09:30:28 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:07:05.494 09:30:28 -- nvmf/run.sh@29 -- # printf %02d 12 00:07:05.494 09:30:28 -- nvmf/run.sh@29 -- # port=4412 00:07:05.494 09:30:28 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:05.494 09:30:28 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:07:05.494 09:30:28 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:05.494 09:30:28 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:07:05.494 [2024-11-29 09:30:28.287814] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:05.494 [2024-11-29 09:30:28.287901] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3184124 ] 00:07:05.494 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.754 [2024-11-29 09:30:28.467059] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.754 [2024-11-29 09:30:28.531422] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:05.754 [2024-11-29 09:30:28.531565] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.754 [2024-11-29 09:30:28.589861] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:06.013 [2024-11-29 09:30:28.606229] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:07:06.013 INFO: Running with entropic power schedule (0xFF, 100). 00:07:06.013 INFO: Seed: 1299738708 00:07:06.013 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:06.013 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:06.013 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:06.013 INFO: A corpus is not provided, starting from an empty corpus 00:07:06.013 #2 INITED exec/s: 0 rss: 60Mb 00:07:06.013 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:06.013 This may also happen if the target rejected all inputs we tried so far 00:07:06.013 [2024-11-29 09:30:28.672411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d1d1d1d1 cdw11:d1d1d1d1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.013 [2024-11-29 09:30:28.672452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.273 NEW_FUNC[1/671]: 0x44b168 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:07:06.273 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:06.273 #26 NEW cov: 11590 ft: 11591 corp: 2/10b lim: 40 exec/s: 0 rss: 68Mb L: 9/9 MS: 4 ChangeBit-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:07:06.273 [2024-11-29 09:30:29.003939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.273 [2024-11-29 09:30:29.003989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.273 [2024-11-29 09:30:29.004124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.273 [2024-11-29 09:30:29.004146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.273 [2024-11-29 09:30:29.004272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.273 [2024-11-29 09:30:29.004293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.273 [2024-11-29 09:30:29.004426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.273 [2024-11-29 09:30:29.004448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.273 #27 NEW cov: 11703 ft: 12928 corp: 3/47b lim: 40 exec/s: 0 rss: 68Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:07:06.273 [2024-11-29 09:30:29.053085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.273 [2024-11-29 09:30:29.053115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.273 #28 NEW cov: 11709 ft: 13144 corp: 4/56b lim: 40 exec/s: 0 rss: 68Mb L: 9/37 MS: 1 ChangeBinInt- 00:07:06.273 [2024-11-29 09:30:29.093151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.273 [2024-11-29 09:30:29.093180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.533 #29 NEW cov: 11794 ft: 13460 corp: 5/65b lim: 40 exec/s: 0 rss: 68Mb L: 9/37 MS: 1 ChangeBit- 00:07:06.533 [2024-11-29 09:30:29.133311] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e32 cdw11:0000002e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.533 [2024-11-29 09:30:29.133339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.533 #30 NEW cov: 11794 ft: 13578 corp: 6/74b lim: 40 exec/s: 0 rss: 68Mb L: 9/37 MS: 1 CMP- DE: "2\000\000\000"- 00:07:06.533 [2024-11-29 09:30:29.173406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e2e cdw11:2e2effff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.533 [2024-11-29 09:30:29.173433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.533 #31 NEW cov: 11794 ft: 13692 corp: 7/88b lim: 40 exec/s: 0 rss: 68Mb L: 14/37 MS: 1 InsertRepeatedBytes- 00:07:06.533 [2024-11-29 09:30:29.213560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.533 [2024-11-29 09:30:29.213590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.533 #32 NEW cov: 11794 ft: 13729 corp: 8/103b lim: 40 exec/s: 0 rss: 68Mb L: 15/37 MS: 1 InsertRepeatedBytes- 00:07:06.533 [2024-11-29 09:30:29.253716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.533 [2024-11-29 09:30:29.253742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.533 #35 NEW cov: 11794 ft: 13774 corp: 9/112b lim: 40 exec/s: 0 rss: 68Mb L: 9/37 MS: 3 ChangeByte-CrossOver-CrossOver- 00:07:06.533 [2024-11-29 09:30:29.293774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d1d1d1d1 cdw11:d1d1d1d1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.533 [2024-11-29 09:30:29.293802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.533 #36 NEW cov: 11794 ft: 13801 corp: 10/121b lim: 40 exec/s: 0 rss: 68Mb L: 9/37 MS: 1 ChangeBinInt- 00:07:06.533 [2024-11-29 09:30:29.333986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d1000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.533 [2024-11-29 09:30:29.334014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.533 #37 NEW cov: 11794 ft: 13894 corp: 11/130b lim: 40 exec/s: 0 rss: 68Mb L: 9/37 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\001"- 00:07:06.533 [2024-11-29 09:30:29.374102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e2e cdw11:2e2ef9ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.533 [2024-11-29 09:30:29.374128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.793 #38 NEW cov: 11794 ft: 13926 corp: 12/144b lim: 40 exec/s: 0 rss: 68Mb L: 14/37 MS: 1 ChangeBinInt- 00:07:06.794 [2024-11-29 09:30:29.414424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 
nsid:0 cdw10:2e2e0000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.794 [2024-11-29 09:30:29.414451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.794 [2024-11-29 09:30:29.414578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00012e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.794 [2024-11-29 09:30:29.414595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.794 #39 NEW cov: 11794 ft: 14164 corp: 13/161b lim: 40 exec/s: 0 rss: 69Mb L: 17/37 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\001"- 00:07:06.794 [2024-11-29 09:30:29.465153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.794 [2024-11-29 09:30:29.465180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.794 [2024-11-29 09:30:29.465305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.794 [2024-11-29 09:30:29.465323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.794 [2024-11-29 09:30:29.465445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:32000000 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.794 [2024-11-29 09:30:29.465460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.794 [2024-11-29 09:30:29.465579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.794 [2024-11-29 09:30:29.465603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.794 #40 NEW cov: 11794 ft: 14178 corp: 14/198b lim: 40 exec/s: 0 rss: 69Mb L: 37/37 MS: 1 PersAutoDict- DE: "2\000\000\000"- 00:07:06.794 [2024-11-29 09:30:29.515347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:fffffdff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.794 [2024-11-29 09:30:29.515374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.794 [2024-11-29 09:30:29.515478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.794 [2024-11-29 09:30:29.515495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:06.794 [2024-11-29 09:30:29.515610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.794 [2024-11-29 09:30:29.515626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:06.794 [2024-11-29 09:30:29.515748] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.794 [2024-11-29 09:30:29.515765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:06.794 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:06.794 #41 NEW cov: 11817 ft: 14188 corp: 15/235b lim: 40 exec/s: 0 rss: 69Mb L: 37/37 MS: 1 ChangeBinInt- 00:07:06.794 [2024-11-29 09:30:29.554625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e2e cdw11:2e2effff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.794 [2024-11-29 09:30:29.554653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.794 #42 NEW cov: 11817 ft: 14315 corp: 16/250b lim: 40 exec/s: 0 rss: 69Mb L: 15/37 MS: 1 InsertByte- 00:07:06.794 [2024-11-29 09:30:29.594695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d1000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.794 [2024-11-29 09:30:29.594722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:06.794 #43 NEW cov: 11817 ft: 14342 corp: 17/264b lim: 40 exec/s: 0 rss: 69Mb L: 14/37 MS: 1 CopyPart- 00:07:06.794 [2024-11-29 09:30:29.634768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e2e cdw11:2e2e2eb9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:06.794 [2024-11-29 09:30:29.634793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.054 #44 NEW cov: 11817 ft: 14346 corp: 18/272b lim: 40 exec/s: 44 rss: 69Mb L: 8/37 MS: 1 EraseBytes- 00:07:07.054 [2024-11-29 09:30:29.675274] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e2e cdw11:2e2effff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.054 [2024-11-29 09:30:29.675303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.054 [2024-11-29 09:30:29.675428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffff33 cdw11:2eec2ebb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.054 [2024-11-29 09:30:29.675446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.054 #45 NEW cov: 11817 ft: 14431 corp: 19/288b lim: 40 exec/s: 45 rss: 69Mb L: 16/37 MS: 1 InsertByte- 00:07:07.054 [2024-11-29 09:30:29.725133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e2e cdw11:2e2effff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.054 [2024-11-29 09:30:29.725161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.054 #46 NEW cov: 11817 ft: 14450 corp: 20/303b lim: 40 exec/s: 46 rss: 69Mb L: 15/37 MS: 1 ShuffleBytes- 00:07:07.054 [2024-11-29 09:30:29.765180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e2e cdw11:2e2e2ebb SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:07:07.054 [2024-11-29 09:30:29.765206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.054 #47 NEW cov: 11817 ft: 14467 corp: 21/311b lim: 40 exec/s: 47 rss: 69Mb L: 8/37 MS: 1 ChangeBit- 00:07:07.054 [2024-11-29 09:30:29.805337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a2f2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.054 [2024-11-29 09:30:29.805364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.054 #48 NEW cov: 11817 ft: 14474 corp: 22/319b lim: 40 exec/s: 48 rss: 69Mb L: 8/37 MS: 1 CrossOver- 00:07:07.054 [2024-11-29 09:30:29.836282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.054 [2024-11-29 09:30:29.836307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.054 [2024-11-29 09:30:29.836429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.054 [2024-11-29 09:30:29.836445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.054 [2024-11-29 09:30:29.836562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.054 [2024-11-29 09:30:29.836577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.054 [2024-11-29 09:30:29.836706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.054 [2024-11-29 09:30:29.836723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.054 #49 NEW cov: 11817 ft: 14542 corp: 23/357b lim: 40 exec/s: 49 rss: 69Mb L: 38/38 MS: 1 CrossOver- 00:07:07.054 [2024-11-29 09:30:29.876288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2e2e0000 cdw11:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.054 [2024-11-29 09:30:29.876314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.054 [2024-11-29 09:30:29.876444] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.054 [2024-11-29 09:30:29.876460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.054 [2024-11-29 09:30:29.876584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.054 [2024-11-29 09:30:29.876602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.054 [2024-11-29 09:30:29.876728] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ff000001 cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.054 [2024-11-29 09:30:29.876748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.314 #50 NEW cov: 11817 ft: 14557 corp: 24/392b lim: 40 exec/s: 50 rss: 69Mb L: 35/38 MS: 1 InsertRepeatedBytes- 00:07:07.314 [2024-11-29 09:30:29.926520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.314 [2024-11-29 09:30:29.926546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.314 [2024-11-29 09:30:29.926667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.314 [2024-11-29 09:30:29.926684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.314 [2024-11-29 09:30:29.926806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:32000000 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.314 [2024-11-29 09:30:29.926822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.314 [2024-11-29 09:30:29.926940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.314 [2024-11-29 09:30:29.926955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.314 #51 NEW cov: 11817 ft: 14593 corp: 25/430b lim: 40 exec/s: 51 rss: 69Mb L: 38/38 MS: 1 InsertByte- 00:07:07.314 [2024-11-29 09:30:29.976178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:2e2e2e2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.314 [2024-11-29 09:30:29.976205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.314 [2024-11-29 09:30:29.976328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:2e2e2e2e cdw11:2e2e372e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.314 [2024-11-29 09:30:29.976344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.314 #52 NEW cov: 11817 ft: 14636 corp: 26/448b lim: 40 exec/s: 52 rss: 69Mb L: 18/38 MS: 1 CrossOver- 00:07:07.314 [2024-11-29 09:30:30.015960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e2e cdw11:2e2a2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.314 [2024-11-29 09:30:30.015989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.314 #53 NEW cov: 11817 ft: 14745 corp: 27/457b lim: 40 exec/s: 53 rss: 69Mb L: 9/38 MS: 1 ChangeBit- 00:07:07.314 [2024-11-29 09:30:30.056128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffff7e2c cdw11:580fb815 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:07.314 [2024-11-29 09:30:30.056157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.314 #54 NEW cov: 11817 ft: 14819 corp: 28/466b lim: 40 exec/s: 54 rss: 69Mb L: 9/38 MS: 1 CMP- DE: "\377\377~,X\017\270\025"- 00:07:07.314 [2024-11-29 09:30:30.106306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:40d12f2e cdw11:2e2e2e2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.314 [2024-11-29 09:30:30.106337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.314 #55 NEW cov: 11817 ft: 14872 corp: 29/475b lim: 40 exec/s: 55 rss: 69Mb L: 9/38 MS: 1 InsertByte- 00:07:07.314 [2024-11-29 09:30:30.146752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e2e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.314 [2024-11-29 09:30:30.146783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.314 [2024-11-29 09:30:30.146902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:2e2e2eb9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.314 [2024-11-29 09:30:30.146919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.577 #56 NEW cov: 11817 ft: 14885 corp: 30/491b lim: 40 exec/s: 56 rss: 69Mb L: 16/38 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\001"- 00:07:07.577 [2024-11-29 09:30:30.187122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d1ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.577 [2024-11-29 09:30:30.187150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.577 [2024-11-29 09:30:30.187272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.577 [2024-11-29 09:30:30.187289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.577 [2024-11-29 09:30:30.187409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffff00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.577 [2024-11-29 09:30:30.187425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.577 #57 NEW cov: 11817 ft: 15081 corp: 31/518b lim: 40 exec/s: 57 rss: 69Mb L: 27/38 MS: 1 InsertRepeatedBytes- 00:07:07.577 [2024-11-29 09:30:30.226638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e2e cdw11:2e2effff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.577 [2024-11-29 09:30:30.226665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.577 #58 NEW cov: 11817 ft: 15120 corp: 32/533b lim: 40 exec/s: 58 rss: 69Mb L: 15/38 MS: 1 ChangeByte- 00:07:07.577 [2024-11-29 09:30:30.266723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) 
qid:0 cid:4 nsid:0 cdw10:ffffd12f cdw11:2effffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.577 [2024-11-29 09:30:30.266752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.577 #59 NEW cov: 11817 ft: 15134 corp: 33/548b lim: 40 exec/s: 59 rss: 69Mb L: 15/38 MS: 1 CrossOver- 00:07:07.577 [2024-11-29 09:30:30.306725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:bf2ed0d1 cdw11:d1d1d1d1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.577 [2024-11-29 09:30:30.306760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.577 #60 NEW cov: 11817 ft: 15158 corp: 34/557b lim: 40 exec/s: 60 rss: 70Mb L: 9/38 MS: 1 ChangeBinInt- 00:07:07.577 [2024-11-29 09:30:30.357313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.577 [2024-11-29 09:30:30.357343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.577 [2024-11-29 09:30:30.357464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:70707070 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.577 [2024-11-29 09:30:30.357481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.577 #66 NEW cov: 11817 ft: 15170 corp: 35/578b lim: 40 exec/s: 66 rss: 70Mb L: 21/38 MS: 1 InsertRepeatedBytes- 00:07:07.577 [2024-11-29 09:30:30.407403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.577 [2024-11-29 09:30:30.407436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.577 [2024-11-29 09:30:30.407561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.577 [2024-11-29 09:30:30.407578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.903 #67 NEW cov: 11817 ft: 15175 corp: 36/601b lim: 40 exec/s: 67 rss: 70Mb L: 23/38 MS: 1 EraseBytes- 00:07:07.903 [2024-11-29 09:30:30.457649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e2e cdw11:2e2effff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.903 [2024-11-29 09:30:30.457679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.903 [2024-11-29 09:30:30.457804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff7e2c58 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.903 [2024-11-29 09:30:30.457820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.903 #68 NEW cov: 11817 ft: 15178 corp: 37/623b lim: 40 exec/s: 68 rss: 70Mb L: 22/38 MS: 1 PersAutoDict- DE: "\377\377~,X\017\270\025"- 00:07:07.903 [2024-11-29 09:30:30.497886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.903 [2024-11-29 09:30:30.497914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.903 [2024-11-29 09:30:30.498036] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff2f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.903 [2024-11-29 09:30:30.498054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.903 [2024-11-29 09:30:30.498169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:f9ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.903 [2024-11-29 09:30:30.498184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.903 [2024-11-29 09:30:30.498310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ff2e2ebb cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.903 [2024-11-29 09:30:30.498326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.903 #69 NEW cov: 11817 ft: 15187 corp: 38/661b lim: 40 exec/s: 69 rss: 70Mb L: 38/38 MS: 1 CrossOver- 00:07:07.903 [2024-11-29 09:30:30.537212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e2e cdw11:2e2e2ebb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.903 [2024-11-29 09:30:30.537240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.903 #70 NEW cov: 11817 ft: 15230 corp: 39/669b lim: 40 exec/s: 70 rss: 70Mb L: 8/38 MS: 1 CopyPart- 00:07:07.903 [2024-11-29 09:30:30.577301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e2e cdw11:2e2eff2e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.903 [2024-11-29 09:30:30.577329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.903 #71 NEW cov: 11817 ft: 15252 corp: 40/679b lim: 40 exec/s: 71 rss: 70Mb L: 10/38 MS: 1 EraseBytes- 00:07:07.903 [2024-11-29 09:30:30.617775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:d12f2e2e cdw11:2e2effff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.903 [2024-11-29 09:30:30.617803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.903 #72 NEW cov: 11817 ft: 15260 corp: 41/693b lim: 40 exec/s: 72 rss: 70Mb L: 14/38 MS: 1 PersAutoDict- DE: "2\000\000\000"- 00:07:07.903 [2024-11-29 09:30:30.658691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:feffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.903 [2024-11-29 09:30:30.658718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:07.903 [2024-11-29 09:30:30.658828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffff2f SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:07.903 [2024-11-29 09:30:30.658847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:07.903 [2024-11-29 09:30:30.658974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:2e2e2e2e cdw11:f9ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.903 [2024-11-29 09:30:30.658989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:07.903 [2024-11-29 09:30:30.659096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ff2e2ebb cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:07.903 [2024-11-29 09:30:30.659111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:07.903 #73 NEW cov: 11817 ft: 15262 corp: 42/731b lim: 40 exec/s: 36 rss: 70Mb L: 38/38 MS: 1 ChangeBit- 00:07:07.903 #73 DONE cov: 11817 ft: 15262 corp: 42/731b lim: 40 exec/s: 36 rss: 70Mb 00:07:07.903 ###### Recommended dictionary. ###### 00:07:07.903 "2\000\000\000" # Uses: 3 00:07:07.903 "\000\000\000\000\000\000\000\001" # Uses: 2 00:07:07.903 "\377\377~,X\017\270\025" # Uses: 1 00:07:07.903 ###### End of recommended dictionary. ###### 00:07:07.903 Done 73 runs in 2 second(s) 00:07:08.181 09:30:30 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf 00:07:08.181 09:30:30 -- ../common.sh@72 -- # (( i++ )) 00:07:08.181 09:30:30 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:08.181 09:30:30 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:07:08.181 09:30:30 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:07:08.181 09:30:30 -- nvmf/run.sh@24 -- # local timen=1 00:07:08.181 09:30:30 -- nvmf/run.sh@25 -- # local core=0x1 00:07:08.181 09:30:30 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:08.181 09:30:30 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:07:08.181 09:30:30 -- nvmf/run.sh@29 -- # printf %02d 13 00:07:08.181 09:30:30 -- nvmf/run.sh@29 -- # port=4413 00:07:08.181 09:30:30 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:08.181 09:30:30 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:07:08.181 09:30:30 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:08.181 09:30:30 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock 00:07:08.181 [2024-11-29 09:30:30.854713] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
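The status lines in the run above follow libFuzzer's usual format: cov counts covered code points, ft distinct features, corp the corpus entry count and total bytes, lim the current input-length cap, exec/s the execution rate, rss resident memory, and MS the mutation sequence that produced the input. The "###### Recommended dictionary. ######" block lists the auto-dictionary entries the run found useful, with a use count for each (they reappear in the log as PersAutoDict mutations). To pull just the coverage progression for a run out of a captured console log, a hypothetical one-liner (not part of the SPDK scripts) is enough:

# Print "#N cov" pairs for every new-coverage event in the captured log.
grep -o '#[0-9]* NEW cov: [0-9]*' console.log | awk '{print $1, $4}'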
00:07:08.181 [2024-11-29 09:30:30.854777] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3184673 ] 00:07:08.181 EAL: No free 2048 kB hugepages reported on node 1 00:07:08.449 [2024-11-29 09:30:31.106647] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.449 [2024-11-29 09:30:31.197310] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:08.449 [2024-11-29 09:30:31.197453] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:08.449 [2024-11-29 09:30:31.255220] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:08.449 [2024-11-29 09:30:31.271574] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:07:08.449 INFO: Running with entropic power schedule (0xFF, 100). 00:07:08.449 INFO: Seed: 3964754019 00:07:08.708 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:08.708 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:08.708 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:07:08.708 INFO: A corpus is not provided, starting from an empty corpus 00:07:08.708 #2 INITED exec/s: 0 rss: 61Mb 00:07:08.708 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:08.708 This may also happen if the target rejected all inputs we tried so far 00:07:08.708 [2024-11-29 09:30:31.316331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.708 [2024-11-29 09:30:31.316364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.708 [2024-11-29 09:30:31.316413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.708 [2024-11-29 09:30:31.316428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.708 [2024-11-29 09:30:31.316458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.708 [2024-11-29 09:30:31.316473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.708 [2024-11-29 09:30:31.316502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.708 [2024-11-29 09:30:31.316517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.967 NEW_FUNC[1/670]: 0x44cd38 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:07:08.967 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:08.967 #3 NEW cov: 11578 ft: 11568 corp: 2/33b lim: 40 
exec/s: 0 rss: 68Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:08.968 [2024-11-29 09:30:31.637066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.968 [2024-11-29 09:30:31.637106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.968 [2024-11-29 09:30:31.637140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.968 [2024-11-29 09:30:31.637155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.968 [2024-11-29 09:30:31.637185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.968 [2024-11-29 09:30:31.637200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.968 [2024-11-29 09:30:31.637229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a97ba9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.968 [2024-11-29 09:30:31.637244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.968 #14 NEW cov: 11691 ft: 12143 corp: 3/65b lim: 40 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 ChangeByte- 00:07:08.968 [2024-11-29 09:30:31.707177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.968 [2024-11-29 09:30:31.707211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.968 [2024-11-29 09:30:31.707261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9b9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.968 [2024-11-29 09:30:31.707277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.968 [2024-11-29 09:30:31.707306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.968 [2024-11-29 09:30:31.707322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.968 [2024-11-29 09:30:31.707351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a97ba9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.968 [2024-11-29 09:30:31.707367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:08.968 #15 NEW cov: 11697 ft: 12336 corp: 4/97b lim: 40 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 ChangeBit- 00:07:08.968 [2024-11-29 09:30:31.777332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.968 [2024-11-29 09:30:31.777365] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:08.968 [2024-11-29 09:30:31.777414] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.968 [2024-11-29 09:30:31.777431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:08.968 [2024-11-29 09:30:31.777461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.968 [2024-11-29 09:30:31.777476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:08.968 [2024-11-29 09:30:31.777505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a97ba9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:08.968 [2024-11-29 09:30:31.777521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.227 #16 NEW cov: 11782 ft: 12557 corp: 5/129b lim: 40 exec/s: 0 rss: 68Mb L: 32/32 MS: 1 CrossOver- 00:07:09.227 [2024-11-29 09:30:31.827427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.227 [2024-11-29 09:30:31.827459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.227 [2024-11-29 09:30:31.827493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.227 [2024-11-29 09:30:31.827508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.227 [2024-11-29 09:30:31.827537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.227 [2024-11-29 09:30:31.827556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.227 [2024-11-29 09:30:31.827586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a97ba9 cdw11:1fa9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.227 [2024-11-29 09:30:31.827610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.227 #17 NEW cov: 11782 ft: 12734 corp: 6/162b lim: 40 exec/s: 0 rss: 68Mb L: 33/33 MS: 1 InsertByte- 00:07:09.227 [2024-11-29 09:30:31.877429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a90aa9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.227 [2024-11-29 09:30:31.877459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.227 [2024-11-29 09:30:31.877506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:09.227 [2024-11-29 09:30:31.877523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.227 #18 NEW cov: 11782 ft: 13366 corp: 7/184b lim: 40 exec/s: 0 rss: 68Mb L: 22/33 MS: 1 CrossOver- 00:07:09.227 [2024-11-29 09:30:31.937590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a90aa9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.227 [2024-11-29 09:30:31.937642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.227 [2024-11-29 09:30:31.937677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.227 [2024-11-29 09:30:31.937692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.227 #19 NEW cov: 11782 ft: 13442 corp: 8/206b lim: 40 exec/s: 0 rss: 68Mb L: 22/33 MS: 1 ShuffleBytes- 00:07:09.227 [2024-11-29 09:30:32.007859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.227 [2024-11-29 09:30:32.007890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.227 [2024-11-29 09:30:32.007923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.227 [2024-11-29 09:30:32.007938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.227 [2024-11-29 09:30:32.007967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.227 [2024-11-29 09:30:32.007983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.227 #20 NEW cov: 11782 ft: 13709 corp: 9/230b lim: 40 exec/s: 0 rss: 68Mb L: 24/33 MS: 1 EraseBytes- 00:07:09.227 [2024-11-29 09:30:32.058007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.227 [2024-11-29 09:30:32.058036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.227 [2024-11-29 09:30:32.058067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.227 [2024-11-29 09:30:32.058082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.227 [2024-11-29 09:30:32.058114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.228 [2024-11-29 09:30:32.058128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.228 
[2024-11-29 09:30:32.058155] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a9a9a9 cdw11:a9a97ba9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.228 [2024-11-29 09:30:32.058169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.487 #26 NEW cov: 11782 ft: 13815 corp: 10/266b lim: 40 exec/s: 0 rss: 69Mb L: 36/36 MS: 1 CopyPart- 00:07:09.487 [2024-11-29 09:30:32.118167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.487 [2024-11-29 09:30:32.118196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.487 [2024-11-29 09:30:32.118244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.487 [2024-11-29 09:30:32.118259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.487 [2024-11-29 09:30:32.118289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.487 [2024-11-29 09:30:32.118304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.487 [2024-11-29 09:30:32.118332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a9a9a9 cdw11:a9a97ba9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.487 [2024-11-29 09:30:32.118347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.487 #27 NEW cov: 11782 ft: 13867 corp: 11/302b lim: 40 exec/s: 0 rss: 69Mb L: 36/36 MS: 1 ShuffleBytes- 00:07:09.487 [2024-11-29 09:30:32.178310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.487 [2024-11-29 09:30:32.178338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.487 [2024-11-29 09:30:32.178386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.487 [2024-11-29 09:30:32.178401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.487 [2024-11-29 09:30:32.178430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.487 [2024-11-29 09:30:32.178446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.487 [2024-11-29 09:30:32.178474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a9a9a9 cdw11:a9a97ba9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.487 [2024-11-29 09:30:32.178489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.487 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:09.487 #28 NEW cov: 11799 ft: 13918 corp: 12/338b lim: 40 exec/s: 0 rss: 69Mb L: 36/36 MS: 1 ShuffleBytes- 00:07:09.487 [2024-11-29 09:30:32.248498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.487 [2024-11-29 09:30:32.248530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.487 [2024-11-29 09:30:32.248578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.487 [2024-11-29 09:30:32.248593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.487 [2024-11-29 09:30:32.248630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.487 [2024-11-29 09:30:32.248645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.487 [2024-11-29 09:30:32.248674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.487 [2024-11-29 09:30:32.248689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.487 #29 NEW cov: 11799 ft: 13935 corp: 13/374b lim: 40 exec/s: 0 rss: 69Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:07:09.487 [2024-11-29 09:30:32.298474] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fb5e0100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.487 [2024-11-29 09:30:32.298502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.746 #34 NEW cov: 11799 ft: 14291 corp: 14/384b lim: 40 exec/s: 34 rss: 69Mb L: 10/36 MS: 5 ShuffleBytes-ChangeByte-InsertByte-ChangeBit-CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:09.746 [2024-11-29 09:30:32.358772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.746 [2024-11-29 09:30:32.358802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.746 [2024-11-29 09:30:32.358835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.746 [2024-11-29 09:30:32.358850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.746 [2024-11-29 09:30:32.358878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a97b cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.746 [2024-11-29 09:30:32.358894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.746 #35 NEW cov: 11799 ft: 14311 corp: 15/409b lim: 40 exec/s: 35 rss: 69Mb L: 25/36 MS: 1 EraseBytes- 00:07:09.746 [2024-11-29 09:30:32.418996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.746 [2024-11-29 09:30:32.419026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.746 [2024-11-29 09:30:32.419074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9b9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.746 [2024-11-29 09:30:32.419090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.746 [2024-11-29 09:30:32.419120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.746 [2024-11-29 09:30:32.419135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.746 [2024-11-29 09:30:32.419168] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a97ba9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.746 [2024-11-29 09:30:32.419184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.746 #36 NEW cov: 11799 ft: 14433 corp: 16/441b lim: 40 exec/s: 36 rss: 69Mb L: 32/36 MS: 1 ShuffleBytes- 00:07:09.746 [2024-11-29 09:30:32.479143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.746 [2024-11-29 09:30:32.479172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:09.746 [2024-11-29 09:30:32.479205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.746 [2024-11-29 09:30:32.479220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:09.746 [2024-11-29 09:30:32.479249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a95756 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.746 [2024-11-29 09:30:32.479265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:09.746 [2024-11-29 09:30:32.479293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:56568456 cdw11:e056a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.746 [2024-11-29 09:30:32.479308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:09.746 #37 NEW cov: 11799 ft: 14470 corp: 17/474b lim: 40 exec/s: 37 rss: 69Mb L: 33/36 MS: 1 ChangeBinInt- 00:07:09.746 [2024-11-29 09:30:32.539157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 
cid:4 nsid:0 cdw10:fb5e0001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:09.746 [2024-11-29 09:30:32.539187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.005 #38 NEW cov: 11799 ft: 14490 corp: 18/484b lim: 40 exec/s: 38 rss: 69Mb L: 10/36 MS: 1 ShuffleBytes- 00:07:10.005 [2024-11-29 09:30:32.609443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.005 [2024-11-29 09:30:32.609472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.005 [2024-11-29 09:30:32.609520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.005 [2024-11-29 09:30:32.609535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.005 [2024-11-29 09:30:32.609563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ada9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.005 [2024-11-29 09:30:32.609578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.005 [2024-11-29 09:30:32.609614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a97ba9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.005 [2024-11-29 09:30:32.609629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.005 #39 NEW cov: 11799 ft: 14536 corp: 19/516b lim: 40 exec/s: 39 rss: 69Mb L: 32/36 MS: 1 ChangeBinInt- 00:07:10.005 [2024-11-29 09:30:32.659463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fb5e00a9 cdw11:a9a90000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.005 [2024-11-29 09:30:32.659497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.005 #40 NEW cov: 11799 ft: 14551 corp: 20/526b lim: 40 exec/s: 40 rss: 69Mb L: 10/36 MS: 1 CrossOver- 00:07:10.005 [2024-11-29 09:30:32.729622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fb5e0100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.005 [2024-11-29 09:30:32.729650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.005 #41 NEW cov: 11799 ft: 14585 corp: 21/541b lim: 40 exec/s: 41 rss: 69Mb L: 15/36 MS: 1 InsertRepeatedBytes- 00:07:10.005 [2024-11-29 09:30:32.790361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fb010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.005 [2024-11-29 09:30:32.790389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.005 #42 NEW cov: 11799 ft: 14660 corp: 22/551b lim: 40 exec/s: 42 rss: 69Mb L: 10/36 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:10.005 [2024-11-29 09:30:32.830839] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.005 [2024-11-29 09:30:32.830865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.005 [2024-11-29 09:30:32.830933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9b9a901 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.005 [2024-11-29 09:30:32.830946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.005 [2024-11-29 09:30:32.831002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:000000a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.005 [2024-11-29 09:30:32.831015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.005 [2024-11-29 09:30:32.831085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a97ba9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.005 [2024-11-29 09:30:32.831099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.264 #43 NEW cov: 11799 ft: 14735 corp: 23/583b lim: 40 exec/s: 43 rss: 69Mb L: 32/36 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:10.264 [2024-11-29 09:30:32.870635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fb010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.264 [2024-11-29 09:30:32.870661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.264 #44 NEW cov: 11799 ft: 14758 corp: 24/593b lim: 40 exec/s: 44 rss: 69Mb L: 10/36 MS: 1 ShuffleBytes- 00:07:10.264 [2024-11-29 09:30:32.911085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.264 [2024-11-29 09:30:32.911111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.264 [2024-11-29 09:30:32.911179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a90100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.264 [2024-11-29 09:30:32.911193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.264 [2024-11-29 09:30:32.911248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0000a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.264 [2024-11-29 09:30:32.911264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.264 [2024-11-29 09:30:32.911316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.264 [2024-11-29 09:30:32.911329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.264 #45 NEW cov: 11799 ft: 14775 corp: 25/625b lim: 40 exec/s: 45 rss: 69Mb L: 32/36 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:10.264 [2024-11-29 09:30:32.951177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.264 [2024-11-29 09:30:32.951203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.264 [2024-11-29 09:30:32.951256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.264 [2024-11-29 09:30:32.951270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.264 [2024-11-29 09:30:32.951321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.264 [2024-11-29 09:30:32.951334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.264 [2024-11-29 09:30:32.951386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a925a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.264 [2024-11-29 09:30:32.951399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.264 #46 NEW cov: 11799 ft: 14779 corp: 26/658b lim: 40 exec/s: 46 rss: 69Mb L: 33/36 MS: 1 InsertByte- 00:07:10.265 [2024-11-29 09:30:32.991323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.265 [2024-11-29 09:30:32.991348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.265 [2024-11-29 09:30:32.991403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.265 [2024-11-29 09:30:32.991416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.265 [2024-11-29 09:30:32.991470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.265 [2024-11-29 09:30:32.991482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.265 [2024-11-29 09:30:32.991535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a9a9a9 cdw11:a9a97ba9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.265 [2024-11-29 09:30:32.991548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.265 #47 NEW cov: 11799 ft: 14794 corp: 27/694b lim: 40 exec/s: 47 rss: 69Mb L: 36/36 MS: 1 ChangeBit- 00:07:10.265 [2024-11-29 09:30:33.031044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fb5e0100 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.265 [2024-11-29 09:30:33.031070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.265 #48 NEW cov: 11799 ft: 14847 corp: 28/704b lim: 40 exec/s: 48 rss: 69Mb L: 10/36 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:10.265 [2024-11-29 09:30:33.071199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fb5e0100 cdw11:002f0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.265 [2024-11-29 09:30:33.071225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.265 #49 NEW cov: 11799 ft: 14874 corp: 29/714b lim: 40 exec/s: 49 rss: 69Mb L: 10/36 MS: 1 ChangeByte- 00:07:10.524 [2024-11-29 09:30:33.111696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.524 [2024-11-29 09:30:33.111723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.524 [2024-11-29 09:30:33.111777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9b9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.524 [2024-11-29 09:30:33.111791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.524 [2024-11-29 09:30:33.111843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.524 [2024-11-29 09:30:33.111857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.524 [2024-11-29 09:30:33.111909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a9a9a9 cdw11:7ba9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.524 [2024-11-29 09:30:33.111922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.524 #50 NEW cov: 11799 ft: 14906 corp: 30/746b lim: 40 exec/s: 50 rss: 69Mb L: 32/36 MS: 1 ShuffleBytes- 00:07:10.524 [2024-11-29 09:30:33.151793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.524 [2024-11-29 09:30:33.151819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.524 [2024-11-29 09:30:33.151872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9b9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.524 [2024-11-29 09:30:33.151885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.524 [2024-11-29 09:30:33.151937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.524 [2024-11-29 09:30:33.151950] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.524 [2024-11-29 09:30:33.152002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a97ba9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.524 [2024-11-29 09:30:33.152014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.524 [2024-11-29 09:30:33.181884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.524 [2024-11-29 09:30:33.181909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.524 [2024-11-29 09:30:33.181964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a8 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.524 [2024-11-29 09:30:33.181978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.524 [2024-11-29 09:30:33.182035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9b9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.524 [2024-11-29 09:30:33.182047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.525 [2024-11-29 09:30:33.182100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.525 [2024-11-29 09:30:33.182112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.525 #52 NEW cov: 11799 ft: 14953 corp: 31/778b lim: 40 exec/s: 52 rss: 69Mb L: 32/36 MS: 2 ChangeBit-CopyPart- 00:07:10.525 [2024-11-29 09:30:33.221648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:fb5e0001 cdw11:00080000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.525 [2024-11-29 09:30:33.221673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.525 #53 NEW cov: 11806 ft: 14966 corp: 32/788b lim: 40 exec/s: 53 rss: 69Mb L: 10/36 MS: 1 ChangeBit- 00:07:10.525 [2024-11-29 09:30:33.262164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0aa9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.525 [2024-11-29 09:30:33.262189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.525 [2024-11-29 09:30:33.262242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.525 [2024-11-29 09:30:33.262256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:10.525 [2024-11-29 09:30:33.262310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a9a9a9 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.525 [2024-11-29 09:30:33.262322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:10.525 [2024-11-29 09:30:33.262375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:a9a9a9a9 cdw11:a9a97ba9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.525 [2024-11-29 09:30:33.262387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:10.525 #54 NEW cov: 11806 ft: 15044 corp: 33/827b lim: 40 exec/s: 54 rss: 69Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:10.525 [2024-11-29 09:30:33.301885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:db5e0001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:10.525 [2024-11-29 09:30:33.301910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:10.525 #55 NEW cov: 11806 ft: 15045 corp: 34/837b lim: 40 exec/s: 27 rss: 69Mb L: 10/39 MS: 1 ChangeBit- 00:07:10.525 #55 DONE cov: 11806 ft: 15045 corp: 34/837b lim: 40 exec/s: 27 rss: 69Mb 00:07:10.525 ###### Recommended dictionary. ###### 00:07:10.525 "\001\000\000\000\000\000\000\000" # Uses: 4 00:07:10.525 ###### End of recommended dictionary. ###### 00:07:10.525 Done 55 runs in 2 second(s) 00:07:10.784 09:30:33 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:07:10.784 09:30:33 -- ../common.sh@72 -- # (( i++ )) 00:07:10.784 09:30:33 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:10.784 09:30:33 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:07:10.784 09:30:33 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:07:10.784 09:30:33 -- nvmf/run.sh@24 -- # local timen=1 00:07:10.784 09:30:33 -- nvmf/run.sh@25 -- # local core=0x1 00:07:10.784 09:30:33 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:10.784 09:30:33 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:07:10.784 09:30:33 -- nvmf/run.sh@29 -- # printf %02d 14 00:07:10.784 09:30:33 -- nvmf/run.sh@29 -- # port=4414 00:07:10.784 09:30:33 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:10.784 09:30:33 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:07:10.784 09:30:33 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:10.784 09:30:33 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:07:10.784 [2024-11-29 09:30:33.484318] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
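The nvmf/run.sh trace above repeats once per fuzzer index: clean up the previous run's config, derive a dedicated TCP port from the index, rewrite the listener's trsvcid in the JSON config, and hand the resulting transport ID to the fuzzer via -F. A minimal sketch of that derivation, using only what the xtrace lines show (variable names follow the trace; xtrace does not print redirections, so the sed output presumably lands in the $nvmf_cfg file that the fuzzer later receives through -c):

    fuzzer_type=14
    port=44$(printf '%02d' "$fuzzer_type")    # printf %02d 14 -> "14", hence port=4414
    nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        test/fuzz/llvm/nvmf/fuzz_json.conf > "$nvmf_cfg"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

The target banner that follows confirms the mapping: run 14 listens on 127.0.0.1 port 4414, and run 15 further below gets 4415.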
00:07:10.784 [2024-11-29 09:30:33.484393] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3185161 ] 00:07:10.784 EAL: No free 2048 kB hugepages reported on node 1 00:07:11.043 [2024-11-29 09:30:33.737190] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.044 [2024-11-29 09:30:33.828178] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:11.044 [2024-11-29 09:30:33.828320] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.303 [2024-11-29 09:30:33.886377] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:11.303 [2024-11-29 09:30:33.902728] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:07:11.303 INFO: Running with entropic power schedule (0xFF, 100). 00:07:11.303 INFO: Seed: 2301758775 00:07:11.303 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:11.303 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:11.303 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:07:11.303 INFO: A corpus is not provided, starting from an empty corpus 00:07:11.303 #2 INITED exec/s: 0 rss: 60Mb 00:07:11.303 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:11.303 This may also happen if the target rejected all inputs we tried so far 00:07:11.303 [2024-11-29 09:30:33.958003] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.303 [2024-11-29 09:30:33.958035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.562 NEW_FUNC[1/671]: 0x44e908 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:07:11.562 NEW_FUNC[2/671]: 0x46fd38 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:11.562 #8 NEW cov: 11577 ft: 11578 corp: 2/11b lim: 35 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:11.562 [2024-11-29 09:30:34.258756] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.562 [2024-11-29 09:30:34.258788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.562 NEW_FUNC[1/1]: 0x170aae8 in nvme_qpair_get_state /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1456 00:07:11.562 #9 NEW cov: 11692 ft: 12074 corp: 3/22b lim: 35 exec/s: 0 rss: 69Mb L: 11/11 MS: 1 CrossOver- 00:07:11.562 [2024-11-29 09:30:34.308840] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.562 [2024-11-29 09:30:34.308868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.562 #10 NEW cov: 11698 ft: 12282 corp: 4/33b lim: 35 exec/s: 0 rss: 69Mb L: 11/11 MS: 1 ShuffleBytes- 
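Each *NOTICE* pair in these blocks is the target-side print of the fuzzed admin command (nvme_admin_qpair_print_command) and its completion (spdk_nvme_print_completion), and the NEW_FUNC lines mark functions whose coverage was reached for the first time in this run. For fuzzer 14 the command under test is Set Features: in cdw10:8000000a, bit 31 is the Save (SV) bit and bits 7:0 are the feature ID, so the value decodes to SV=1 on feature 0Ah (Write Atomicity Normal), which is exactly why the controller completes with FEATURE ID NOT SAVEABLE (status code type 01h, status code 0Dh). A quick decode as a hypothetical one-off shell helper, not part of the test scripts:

    cdw10=0x8000000a
    printf 'sv=%d fid=0x%02x\n' $(( (cdw10 >> 31) & 1 )) $(( cdw10 & 0xff ))
    # prints: sv=1 fid=0x0a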
00:07:11.562 [2024-11-29 09:30:34.348918] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.562 [2024-11-29 09:30:34.348945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.562 #11 NEW cov: 11783 ft: 12628 corp: 5/44b lim: 35 exec/s: 0 rss: 69Mb L: 11/11 MS: 1 ShuffleBytes- 00:07:11.562 [2024-11-29 09:30:34.389063] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.562 [2024-11-29 09:30:34.389091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.821 #12 NEW cov: 11783 ft: 12703 corp: 6/55b lim: 35 exec/s: 0 rss: 69Mb L: 11/11 MS: 1 ShuffleBytes- 00:07:11.821 [2024-11-29 09:30:34.429172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.821 [2024-11-29 09:30:34.429199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.821 #18 NEW cov: 11783 ft: 12776 corp: 7/65b lim: 35 exec/s: 0 rss: 69Mb L: 10/11 MS: 1 ChangeBit- 00:07:11.821 [2024-11-29 09:30:34.469283] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.821 [2024-11-29 09:30:34.469311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.821 #24 NEW cov: 11783 ft: 12846 corp: 8/76b lim: 35 exec/s: 0 rss: 69Mb L: 11/11 MS: 1 ChangeByte- 00:07:11.821 [2024-11-29 09:30:34.509416] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.821 [2024-11-29 09:30:34.509443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.821 #25 NEW cov: 11783 ft: 12859 corp: 9/86b lim: 35 exec/s: 0 rss: 69Mb L: 10/11 MS: 1 ShuffleBytes- 00:07:11.821 [2024-11-29 09:30:34.539515] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.821 [2024-11-29 09:30:34.539542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.821 #26 NEW cov: 11783 ft: 12958 corp: 10/98b lim: 35 exec/s: 0 rss: 69Mb L: 12/12 MS: 1 InsertByte- 00:07:11.821 [2024-11-29 09:30:34.579612] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.821 [2024-11-29 09:30:34.579640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.821 #27 NEW cov: 11786 ft: 13050 corp: 11/109b lim: 35 exec/s: 0 rss: 69Mb L: 11/12 MS: 1 ChangeBinInt- 00:07:11.821 [2024-11-29 09:30:34.619755] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.821 [2024-11-29 09:30:34.619781] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:11.821 #28 NEW cov: 11786 ft: 13066 corp: 12/120b lim: 35 exec/s: 0 rss: 69Mb L: 11/12 MS: 1 CopyPart- 00:07:11.822 [2024-11-29 09:30:34.659846] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:11.822 [2024-11-29 09:30:34.659873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.081 #29 NEW cov: 11786 ft: 13115 corp: 13/132b lim: 35 exec/s: 0 rss: 69Mb L: 12/12 MS: 1 ChangeBinInt- 00:07:12.081 [2024-11-29 09:30:34.699986] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.081 [2024-11-29 09:30:34.700016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.081 #30 NEW cov: 11786 ft: 13133 corp: 14/144b lim: 35 exec/s: 0 rss: 70Mb L: 12/12 MS: 1 ChangeByte- 00:07:12.081 [2024-11-29 09:30:34.740068] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.081 [2024-11-29 09:30:34.740095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.081 #31 NEW cov: 11786 ft: 13159 corp: 15/155b lim: 35 exec/s: 0 rss: 70Mb L: 11/12 MS: 1 ChangeBinInt- 00:07:12.081 [2024-11-29 09:30:34.780195] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.081 [2024-11-29 09:30:34.780223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.081 #32 NEW cov: 11786 ft: 13173 corp: 16/166b lim: 35 exec/s: 0 rss: 70Mb L: 11/12 MS: 1 CMP- DE: "\001\223\317\326\275\203\360b"- 00:07:12.081 [2024-11-29 09:30:34.810264] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.081 [2024-11-29 09:30:34.810290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.081 #38 NEW cov: 11786 ft: 13175 corp: 17/173b lim: 35 exec/s: 0 rss: 70Mb L: 7/12 MS: 1 EraseBytes- 00:07:12.081 [2024-11-29 09:30:34.840316] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000028 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.081 [2024-11-29 09:30:34.840342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.081 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:12.081 #39 NEW cov: 11816 ft: 13237 corp: 18/185b lim: 35 exec/s: 0 rss: 70Mb L: 12/12 MS: 1 InsertByte- 00:07:12.081 NEW_FUNC[1/1]: 0x1133ce8 in nvmf_ctrlr_set_features_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1651 00:07:12.081 #40 NEW cov: 11839 ft: 13277 corp: 19/197b lim: 35 exec/s: 0 rss: 70Mb L: 12/12 MS: 1 PersAutoDict- DE: "\001\223\317\326\275\203\360b"- 00:07:12.340 [2024-11-29 09:30:34.930876] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.340 [2024-11-29 09:30:34.930903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.340 [2024-11-29 09:30:34.930976] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.340 [2024-11-29 09:30:34.930992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.340 #41 NEW cov: 11839 ft: 14062 corp: 20/212b lim: 35 exec/s: 41 rss: 70Mb L: 15/15 MS: 1 CrossOver- 00:07:12.340 [2024-11-29 09:30:34.980852] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.340 [2024-11-29 09:30:34.980880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.340 #42 NEW cov: 11839 ft: 14071 corp: 21/223b lim: 35 exec/s: 42 rss: 70Mb L: 11/15 MS: 1 ChangeByte- 00:07:12.340 [2024-11-29 09:30:35.020922] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.340 [2024-11-29 09:30:35.020949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.340 #43 NEW cov: 11839 ft: 14164 corp: 22/234b lim: 35 exec/s: 43 rss: 70Mb L: 11/15 MS: 1 ChangeBinInt- 00:07:12.340 [2024-11-29 09:30:35.060978] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.340 [2024-11-29 09:30:35.061005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.340 #44 NEW cov: 11839 ft: 14217 corp: 23/245b lim: 35 exec/s: 44 rss: 70Mb L: 11/15 MS: 1 PersAutoDict- DE: "\001\223\317\326\275\203\360b"- 00:07:12.340 [2024-11-29 09:30:35.101136] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.340 [2024-11-29 09:30:35.101164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.340 #45 NEW cov: 11839 ft: 14232 corp: 24/256b lim: 35 exec/s: 45 rss: 70Mb L: 11/15 MS: 1 ShuffleBytes- 00:07:12.340 [2024-11-29 09:30:35.131240] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.340 [2024-11-29 09:30:35.131265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.340 #46 NEW cov: 11839 ft: 14246 corp: 25/267b lim: 35 exec/s: 46 rss: 70Mb L: 11/15 MS: 1 ChangeByte- 00:07:12.340 [2024-11-29 09:30:35.161352] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.340 [2024-11-29 09:30:35.161379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
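Reading the libFuzzer status lines, in an entry like #46 just above: cov is the number of coverage points observed so far, ft counts finer-grained coverage features, corp: 25/267b means the corpus now holds 25 inputs totaling 267 bytes, lim is the current cap on generated input length, exec/s is executions per second, rss is resident memory, L: 11/15 gives the new input's length against the largest input in the corpus, and MS: 1 ChangeByte- names the mutation sequence that produced it. Entries tagged PersAutoDict additionally show the persistent auto-dictionary bytes used after DE:, which is why the same "\001\223\317\326\275\203\360b" sequence recurs in this run and is tallied under Uses in the closing dictionary block.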
00:07:12.340 #47 NEW cov: 11839 ft: 14301 corp: 26/278b lim: 35 exec/s: 47 rss: 70Mb L: 11/15 MS: 1 ChangeBinInt- 00:07:12.599 #48 NEW cov: 11839 ft: 14314 corp: 27/290b lim: 35 exec/s: 48 rss: 70Mb L: 12/15 MS: 1 InsertByte- 00:07:12.599 [2024-11-29 09:30:35.241620] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.599 [2024-11-29 09:30:35.241649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.599 #49 NEW cov: 11839 ft: 14364 corp: 28/301b lim: 35 exec/s: 49 rss: 70Mb L: 11/15 MS: 1 ChangeBit- 00:07:12.599 [2024-11-29 09:30:35.282295] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.599 [2024-11-29 09:30:35.282322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.599 [2024-11-29 09:30:35.282399] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.599 [2024-11-29 09:30:35.282415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.599 [2024-11-29 09:30:35.282475] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.599 [2024-11-29 09:30:35.282490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.599 [2024-11-29 09:30:35.282549] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.599 [2024-11-29 09:30:35.282564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.599 #50 NEW cov: 11839 ft: 14749 corp: 29/333b lim: 35 exec/s: 50 rss: 70Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:12.599 [2024-11-29 09:30:35.322332] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.599 [2024-11-29 09:30:35.322359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.599 [2024-11-29 09:30:35.322438] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.600 [2024-11-29 09:30:35.322454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.600 [2024-11-29 09:30:35.322513] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.600 [2024-11-29 09:30:35.322528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:12.600 [2024-11-29 09:30:35.322584] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000c8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.600 [2024-11-29 09:30:35.322603] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:12.600 #51 NEW cov: 11839 ft: 14814 corp: 30/366b lim: 35 exec/s: 51 rss: 70Mb L: 33/33 MS: 1 InsertByte- 00:07:12.600 [2024-11-29 09:30:35.371995] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.600 [2024-11-29 09:30:35.372022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.600 #52 NEW cov: 11839 ft: 14838 corp: 31/377b lim: 35 exec/s: 52 rss: 70Mb L: 11/33 MS: 1 ChangeBit- 00:07:12.600 [2024-11-29 09:30:35.412029] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.600 [2024-11-29 09:30:35.412056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.600 #53 NEW cov: 11839 ft: 14859 corp: 32/388b lim: 35 exec/s: 53 rss: 70Mb L: 11/33 MS: 1 ChangeBit- 00:07:12.859 #57 NEW cov: 11839 ft: 14862 corp: 33/395b lim: 35 exec/s: 57 rss: 70Mb L: 7/33 MS: 4 EraseBytes-ChangeBinInt-ChangeBit-CrossOver- 00:07:12.859 [2024-11-29 09:30:35.492363] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.859 [2024-11-29 09:30:35.492391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.859 #58 NEW cov: 11839 ft: 14886 corp: 34/405b lim: 35 exec/s: 58 rss: 70Mb L: 10/33 MS: 1 EraseBytes- 00:07:12.859 [2024-11-29 09:30:35.532455] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.859 [2024-11-29 09:30:35.532482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.859 #59 NEW cov: 11839 ft: 14899 corp: 35/416b lim: 35 exec/s: 59 rss: 70Mb L: 11/33 MS: 1 ShuffleBytes- 00:07:12.859 [2024-11-29 09:30:35.572758] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.859 [2024-11-29 09:30:35.572784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.859 [2024-11-29 09:30:35.572844] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.859 [2024-11-29 09:30:35.572860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:12.859 #60 NEW cov: 11839 ft: 14909 corp: 36/434b lim: 35 exec/s: 60 rss: 70Mb L: 18/33 MS: 1 CrossOver- 00:07:12.859 [2024-11-29 09:30:35.612690] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.859 [2024-11-29 09:30:35.612717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:12.859 #61 NEW cov: 11839 ft: 14910 corp: 37/444b lim: 35 exec/s: 61 rss: 
70Mb L: 10/33 MS: 1 CrossOver- 00:07:12.859 #62 NEW cov: 11839 ft: 14926 corp: 38/455b lim: 35 exec/s: 62 rss: 70Mb L: 11/33 MS: 1 InsertRepeatedBytes- 00:07:12.859 [2024-11-29 09:30:35.692951] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:12.859 [2024-11-29 09:30:35.692978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.118 #63 NEW cov: 11839 ft: 14936 corp: 39/467b lim: 35 exec/s: 63 rss: 70Mb L: 12/33 MS: 1 InsertByte- 00:07:13.118 [2024-11-29 09:30:35.733209] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.118 [2024-11-29 09:30:35.733237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.118 [2024-11-29 09:30:35.733298] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.118 [2024-11-29 09:30:35.733312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.118 #64 NEW cov: 11839 ft: 14943 corp: 40/486b lim: 35 exec/s: 64 rss: 70Mb L: 19/33 MS: 1 PersAutoDict- DE: "\001\223\317\326\275\203\360b"- 00:07:13.118 [2024-11-29 09:30:35.773200] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.118 [2024-11-29 09:30:35.773227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.118 [2024-11-29 09:30:35.813631] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.118 [2024-11-29 09:30:35.813658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.118 [2024-11-29 09:30:35.813717] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.118 [2024-11-29 09:30:35.813733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:13.118 [2024-11-29 09:30:35.813792] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:000000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.118 [2024-11-29 09:30:35.813806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:13.118 #66 NEW cov: 11839 ft: 15167 corp: 41/509b lim: 35 exec/s: 66 rss: 70Mb L: 23/33 MS: 2 ShuffleBytes-CrossOver- 00:07:13.118 [2024-11-29 09:30:35.853392] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.118 [2024-11-29 09:30:35.853419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.118 #67 NEW cov: 11839 ft: 15169 corp: 42/519b lim: 35 exec/s: 67 rss: 70Mb L: 10/33 MS: 1 ChangeBinInt- 00:07:13.118 [2024-11-29 09:30:35.893541] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.118 [2024-11-29 09:30:35.893568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.118 #68 NEW cov: 11839 ft: 15193 corp: 43/531b lim: 35 exec/s: 68 rss: 70Mb L: 12/33 MS: 1 ChangeByte- 00:07:13.118 [2024-11-29 09:30:35.933652] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES WRITE ATOMICITY cid:4 cdw10:8000000a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:13.118 [2024-11-29 09:30:35.933679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:13.119 #69 NEW cov: 11839 ft: 15194 corp: 44/542b lim: 35 exec/s: 34 rss: 70Mb L: 11/33 MS: 1 ShuffleBytes- 00:07:13.119 #69 DONE cov: 11839 ft: 15194 corp: 44/542b lim: 35 exec/s: 34 rss: 70Mb 00:07:13.119 ###### Recommended dictionary. ###### 00:07:13.119 "\001\223\317\326\275\203\360b" # Uses: 3 00:07:13.119 ###### End of recommended dictionary. ###### 00:07:13.119 Done 69 runs in 2 second(s) 00:07:13.378 09:30:36 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:07:13.378 09:30:36 -- ../common.sh@72 -- # (( i++ )) 00:07:13.378 09:30:36 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:13.378 09:30:36 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:07:13.378 09:30:36 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:07:13.378 09:30:36 -- nvmf/run.sh@24 -- # local timen=1 00:07:13.378 09:30:36 -- nvmf/run.sh@25 -- # local core=0x1 00:07:13.378 09:30:36 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:13.378 09:30:36 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:07:13.378 09:30:36 -- nvmf/run.sh@29 -- # printf %02d 15 00:07:13.378 09:30:36 -- nvmf/run.sh@29 -- # port=4415 00:07:13.378 09:30:36 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:13.378 09:30:36 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:07:13.378 09:30:36 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:13.378 09:30:36 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:07:13.378 [2024-11-29 09:30:36.115989] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
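The epilogue above is the normal end of a run: #69 DONE repeats the final statistics, the Recommended dictionary block reports each persistent auto-dictionary entry with its use count, and "Done 69 runs in 2 second(s)" reflects the one-second fuzzing budget (timen=1 in run.sh, passed to the fuzzer as -t 1) plus target startup and teardown. Since the full invocation is printed in the trace, a single fuzzer can in principle be replayed by hand; a hypothetical re-run of fuzzer 15 sketched below, with the workspace paths shortened, the Jenkins-specific -P output directory dropped, and a longer time budget (those three changes are the only departures from the logged command):

    ./test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 \
        -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' \
        -c /tmp/fuzz_json_15.conf -t 60 -D ../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock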
00:07:13.378 [2024-11-29 09:30:36.116061] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3185506 ] 00:07:13.378 EAL: No free 2048 kB hugepages reported on node 1 00:07:13.637 [2024-11-29 09:30:36.373941] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.638 [2024-11-29 09:30:36.459018] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:13.638 [2024-11-29 09:30:36.459150] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.897 [2024-11-29 09:30:36.517490] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:13.897 [2024-11-29 09:30:36.533833] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:07:13.897 INFO: Running with entropic power schedule (0xFF, 100). 00:07:13.897 INFO: Seed: 637811455 00:07:13.897 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:13.897 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:13.897 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:07:13.897 INFO: A corpus is not provided, starting from an empty corpus 00:07:13.897 #2 INITED exec/s: 0 rss: 60Mb 00:07:13.897 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:13.897 This may also happen if the target rejected all inputs we tried so far 00:07:13.897 [2024-11-29 09:30:36.589018] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:13.897 [2024-11-29 09:30:36.589046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.157 NEW_FUNC[1/670]: 0x44fe48 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:07:14.157 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:14.157 #12 NEW cov: 11560 ft: 11561 corp: 2/8b lim: 35 exec/s: 0 rss: 68Mb L: 7/7 MS: 5 CopyPart-InsertByte-CMP-ShuffleBytes-InsertByte- DE: "\001\000\000\000"- 00:07:14.157 [2024-11-29 09:30:36.889881] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.157 [2024-11-29 09:30:36.889914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.157 [2024-11-29 09:30:36.889972] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.157 [2024-11-29 09:30:36.889986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.157 #16 NEW cov: 11673 ft: 12362 corp: 3/28b lim: 35 exec/s: 0 rss: 68Mb L: 20/20 MS: 4 ChangeBit-ChangeByte-CopyPart-InsertRepeatedBytes- 00:07:14.157 NEW_FUNC[1/1]: 0x46fd38 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:07:14.157 #20 NEW cov: 11693 ft: 12634 corp: 4/35b lim: 35 
exec/s: 0 rss: 68Mb L: 7/20 MS: 4 PersAutoDict-InsertByte-ChangeByte-CopyPart- DE: "\001\000\000\000"- 00:07:14.157 [2024-11-29 09:30:36.960246] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.157 [2024-11-29 09:30:36.960273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.157 [2024-11-29 09:30:36.960349] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.157 [2024-11-29 09:30:36.960363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.157 [2024-11-29 09:30:36.960420] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.157 [2024-11-29 09:30:36.960433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.157 [2024-11-29 09:30:36.960489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.157 [2024-11-29 09:30:36.960503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.157 #21 NEW cov: 11778 ft: 13436 corp: 5/69b lim: 35 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:07:14.416 #22 NEW cov: 11778 ft: 13606 corp: 6/77b lim: 35 exec/s: 0 rss: 68Mb L: 8/34 MS: 1 InsertByte- 00:07:14.416 [2024-11-29 09:30:37.050452] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.416 [2024-11-29 09:30:37.050478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.416 [2024-11-29 09:30:37.050552] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.416 [2024-11-29 09:30:37.050566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.416 [2024-11-29 09:30:37.050631] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.416 [2024-11-29 09:30:37.050646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.416 #23 NEW cov: 11778 ft: 13886 corp: 7/104b lim: 35 exec/s: 0 rss: 68Mb L: 27/34 MS: 1 CrossOver- 00:07:14.416 [2024-11-29 09:30:37.090504] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.416 [2024-11-29 09:30:37.090529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.416 [2024-11-29 09:30:37.090589] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.416 [2024-11-29 09:30:37.090608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:07:14.416 [2024-11-29 09:30:37.090665] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.417 [2024-11-29 09:30:37.090678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.417 #24 NEW cov: 11778 ft: 13939 corp: 8/131b lim: 35 exec/s: 0 rss: 68Mb L: 27/34 MS: 1 ChangeByte- 00:07:14.417 [2024-11-29 09:30:37.130532] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.417 [2024-11-29 09:30:37.130557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.417 [2024-11-29 09:30:37.130622] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.417 [2024-11-29 09:30:37.130636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.417 #25 NEW cov: 11778 ft: 13964 corp: 9/151b lim: 35 exec/s: 0 rss: 68Mb L: 20/34 MS: 1 ChangeByte- 00:07:14.417 [2024-11-29 09:30:37.170670] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.417 [2024-11-29 09:30:37.170695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.417 [2024-11-29 09:30:37.170754] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.417 [2024-11-29 09:30:37.170768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.417 #26 NEW cov: 11778 ft: 13979 corp: 10/171b lim: 35 exec/s: 0 rss: 69Mb L: 20/34 MS: 1 CopyPart- 00:07:14.417 #27 NEW cov: 11778 ft: 14040 corp: 11/179b lim: 35 exec/s: 0 rss: 69Mb L: 8/34 MS: 1 ChangeBinInt- 00:07:14.417 [2024-11-29 09:30:37.251136] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.417 [2024-11-29 09:30:37.251161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.417 [2024-11-29 09:30:37.251219] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.417 [2024-11-29 09:30:37.251232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.417 [2024-11-29 09:30:37.251288] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.417 [2024-11-29 09:30:37.251302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.417 [2024-11-29 09:30:37.251356] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.417 [2024-11-29 09:30:37.251370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:07:14.676 #28 NEW cov: 11778 ft: 14095 corp: 12/213b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 ChangeBit- 00:07:14.676 [2024-11-29 09:30:37.291213] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.676 [2024-11-29 09:30:37.291239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.676 NEW_FUNC[1/1]: 0x4691c8 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:07:14.676 #31 NEW cov: 11816 ft: 14257 corp: 13/236b lim: 35 exec/s: 0 rss: 69Mb L: 23/34 MS: 3 PersAutoDict-InsertByte-CrossOver- DE: "\001\000\000\000"- 00:07:14.676 #32 NEW cov: 11816 ft: 14347 corp: 14/243b lim: 35 exec/s: 0 rss: 69Mb L: 7/34 MS: 1 CrossOver- 00:07:14.676 [2024-11-29 09:30:37.371590] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.676 [2024-11-29 09:30:37.371621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.676 [2024-11-29 09:30:37.371679] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.676 [2024-11-29 09:30:37.371693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.676 [2024-11-29 09:30:37.371752] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.676 [2024-11-29 09:30:37.371765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.676 [2024-11-29 09:30:37.371822] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.676 [2024-11-29 09:30:37.371835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.676 [2024-11-29 09:30:37.371892] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.676 [2024-11-29 09:30:37.371905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:14.676 #33 NEW cov: 11816 ft: 14441 corp: 15/278b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 InsertByte- 00:07:14.676 [2024-11-29 09:30:37.411614] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.676 [2024-11-29 09:30:37.411640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.676 [2024-11-29 09:30:37.411715] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.676 [2024-11-29 09:30:37.411730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.676 [2024-11-29 09:30:37.411787] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET 
FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.676 [2024-11-29 09:30:37.411801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.676 [2024-11-29 09:30:37.411857] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.676 [2024-11-29 09:30:37.411871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.676 #34 NEW cov: 11816 ft: 14447 corp: 16/308b lim: 35 exec/s: 0 rss: 69Mb L: 30/35 MS: 1 InsertRepeatedBytes- 00:07:14.676 [2024-11-29 09:30:37.451884] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000073d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.676 [2024-11-29 09:30:37.451909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.676 [2024-11-29 09:30:37.451966] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.676 [2024-11-29 09:30:37.451979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.676 [2024-11-29 09:30:37.452055] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.676 [2024-11-29 09:30:37.452069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.676 [2024-11-29 09:30:37.452125] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.676 [2024-11-29 09:30:37.452139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.676 [2024-11-29 09:30:37.452193] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.676 [2024-11-29 09:30:37.452206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:14.676 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:14.676 #35 NEW cov: 11839 ft: 14483 corp: 17/343b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:07:14.676 [2024-11-29 09:30:37.501901] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.676 [2024-11-29 09:30:37.501926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.936 #36 NEW cov: 11839 ft: 14503 corp: 18/366b lim: 35 exec/s: 0 rss: 69Mb L: 23/35 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:14.936 #37 NEW cov: 11839 ft: 14524 corp: 19/374b lim: 35 exec/s: 0 rss: 69Mb L: 8/35 MS: 1 CopyPart- 00:07:14.936 [2024-11-29 09:30:37.582018] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000053d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.936 [2024-11-29 09:30:37.582044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.936 [2024-11-29 09:30:37.582088] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.936 [2024-11-29 09:30:37.582102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.936 [2024-11-29 09:30:37.582161] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.936 [2024-11-29 09:30:37.582174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.936 #38 NEW cov: 11839 ft: 14548 corp: 20/401b lim: 35 exec/s: 38 rss: 69Mb L: 27/35 MS: 1 ChangeByte- 00:07:14.936 [2024-11-29 09:30:37.622219] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.936 [2024-11-29 09:30:37.622244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.936 [2024-11-29 09:30:37.622316] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.936 [2024-11-29 09:30:37.622331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.936 [2024-11-29 09:30:37.622389] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.936 [2024-11-29 09:30:37.622402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.936 [2024-11-29 09:30:37.622460] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.936 [2024-11-29 09:30:37.622474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:14.936 #39 NEW cov: 11839 ft: 14576 corp: 21/435b lim: 35 exec/s: 39 rss: 69Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:07:14.936 #40 NEW cov: 11839 ft: 14592 corp: 22/447b lim: 35 exec/s: 40 rss: 69Mb L: 12/35 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:14.936 #41 NEW cov: 11839 ft: 14617 corp: 23/455b lim: 35 exec/s: 41 rss: 70Mb L: 8/35 MS: 1 CopyPart- 00:07:14.936 [2024-11-29 09:30:37.742471] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000053d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.936 [2024-11-29 09:30:37.742498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:14.936 [2024-11-29 09:30:37.742556] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.936 [2024-11-29 09:30:37.742570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:14.936 [2024-11-29 09:30:37.742631] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:14.936 [2024-11-29 09:30:37.742644] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:14.936 #42 NEW cov: 11839 ft: 14645 corp: 24/482b lim: 35 exec/s: 42 rss: 70Mb L: 27/35 MS: 1 ChangeByte- 00:07:15.195 #43 NEW cov: 11839 ft: 14732 corp: 25/490b lim: 35 exec/s: 43 rss: 70Mb L: 8/35 MS: 1 ChangeBinInt- 00:07:15.195 [2024-11-29 09:30:37.812913] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000073d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.195 [2024-11-29 09:30:37.812937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.195 [2024-11-29 09:30:37.812996] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.195 [2024-11-29 09:30:37.813010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.195 [2024-11-29 09:30:37.813068] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.195 [2024-11-29 09:30:37.813081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.195 [2024-11-29 09:30:37.813140] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.195 [2024-11-29 09:30:37.813152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.195 [2024-11-29 09:30:37.813211] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.195 [2024-11-29 09:30:37.813224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.195 #44 NEW cov: 11839 ft: 14746 corp: 26/525b lim: 35 exec/s: 44 rss: 70Mb L: 35/35 MS: 1 ChangeBit- 00:07:15.195 [2024-11-29 09:30:37.862909] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.195 [2024-11-29 09:30:37.862935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.195 [2024-11-29 09:30:37.863011] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.195 [2024-11-29 09:30:37.863026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.196 [2024-11-29 09:30:37.863087] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.196 [2024-11-29 09:30:37.863101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.196 [2024-11-29 09:30:37.863163] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.196 [2024-11-29 09:30:37.863176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 
cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.196 #45 NEW cov: 11839 ft: 14749 corp: 27/559b lim: 35 exec/s: 45 rss: 70Mb L: 34/35 MS: 1 ChangeBit- 00:07:15.196 [2024-11-29 09:30:37.902900] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.196 [2024-11-29 09:30:37.902926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.196 [2024-11-29 09:30:37.903001] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.196 [2024-11-29 09:30:37.903016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.196 [2024-11-29 09:30:37.903073] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.196 [2024-11-29 09:30:37.903087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.196 #46 NEW cov: 11839 ft: 14774 corp: 28/580b lim: 35 exec/s: 46 rss: 70Mb L: 21/35 MS: 1 InsertByte- 00:07:15.196 #47 NEW cov: 11839 ft: 14877 corp: 29/588b lim: 35 exec/s: 47 rss: 70Mb L: 8/35 MS: 1 ChangeByte- 00:07:15.196 [2024-11-29 09:30:37.982892] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000fa SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.196 [2024-11-29 09:30:37.982919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.196 #53 NEW cov: 11839 ft: 14898 corp: 30/596b lim: 35 exec/s: 53 rss: 70Mb L: 8/35 MS: 1 ChangeBinInt- 00:07:15.196 [2024-11-29 09:30:38.023383] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.196 [2024-11-29 09:30:38.023408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.196 [2024-11-29 09:30:38.023467] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.196 [2024-11-29 09:30:38.023481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.196 [2024-11-29 09:30:38.023554] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.196 [2024-11-29 09:30:38.023568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.196 [2024-11-29 09:30:38.023630] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.196 [2024-11-29 09:30:38.023644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.455 #54 NEW cov: 11839 ft: 14914 corp: 31/627b lim: 35 exec/s: 54 rss: 70Mb L: 31/35 MS: 1 InsertByte- 00:07:15.455 [2024-11-29 09:30:38.063573] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000036c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:15.455 [2024-11-29 09:30:38.063603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.455 [2024-11-29 09:30:38.063680] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000036c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.455 [2024-11-29 09:30:38.063694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.455 [2024-11-29 09:30:38.063758] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000036c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.455 [2024-11-29 09:30:38.063771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.455 #56 NEW cov: 11839 ft: 14926 corp: 32/656b lim: 35 exec/s: 56 rss: 70Mb L: 29/35 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:15.455 #57 NEW cov: 11839 ft: 14955 corp: 33/667b lim: 35 exec/s: 57 rss: 70Mb L: 11/35 MS: 1 CopyPart- 00:07:15.455 [2024-11-29 09:30:38.143841] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.455 [2024-11-29 09:30:38.143866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.455 [2024-11-29 09:30:38.143940] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.455 [2024-11-29 09:30:38.143954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.455 [2024-11-29 09:30:38.144014] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.455 [2024-11-29 09:30:38.144028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.455 [2024-11-29 09:30:38.144087] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.455 [2024-11-29 09:30:38.144101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.455 [2024-11-29 09:30:38.144158] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.455 [2024-11-29 09:30:38.144171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.455 #58 NEW cov: 11839 ft: 14990 corp: 34/702b lim: 35 exec/s: 58 rss: 70Mb L: 35/35 MS: 1 ChangeBinInt- 00:07:15.455 [2024-11-29 09:30:38.183948] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.455 [2024-11-29 09:30:38.183974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.455 [2024-11-29 09:30:38.184047] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000002b6 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.455 [2024-11-29 09:30:38.184060] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.455 [2024-11-29 09:30:38.184120] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.455 [2024-11-29 09:30:38.184133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.455 [2024-11-29 09:30:38.184192] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.455 [2024-11-29 09:30:38.184206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.455 [2024-11-29 09:30:38.184266] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.455 [2024-11-29 09:30:38.184280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.455 #59 NEW cov: 11839 ft: 15005 corp: 35/737b lim: 35 exec/s: 59 rss: 70Mb L: 35/35 MS: 1 CrossOver- 00:07:15.455 [2024-11-29 09:30:38.223953] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.455 [2024-11-29 09:30:38.223981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.455 [2024-11-29 09:30:38.224040] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.455 [2024-11-29 09:30:38.224054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.455 [2024-11-29 09:30:38.224113] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.455 [2024-11-29 09:30:38.224126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.455 [2024-11-29 09:30:38.224183] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.455 [2024-11-29 09:30:38.224196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.455 #60 NEW cov: 11839 ft: 15019 corp: 36/771b lim: 35 exec/s: 60 rss: 70Mb L: 34/35 MS: 1 ShuffleBytes- 00:07:15.455 #61 NEW cov: 11839 ft: 15038 corp: 37/779b lim: 35 exec/s: 61 rss: 70Mb L: 8/35 MS: 1 ChangeByte- 00:07:15.715 [2024-11-29 09:30:38.304087] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.715 [2024-11-29 09:30:38.304112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.715 [2024-11-29 09:30:38.304172] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.715 [2024-11-29 09:30:38.304185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 
p:0 m:0 dnr:0 00:07:15.715 [2024-11-29 09:30:38.304260] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.715 [2024-11-29 09:30:38.304274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.715 #62 NEW cov: 11839 ft: 15045 corp: 38/800b lim: 35 exec/s: 62 rss: 70Mb L: 21/35 MS: 1 InsertByte- 00:07:15.715 [2024-11-29 09:30:38.344440] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000073d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.715 [2024-11-29 09:30:38.344466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.715 [2024-11-29 09:30:38.344526] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.715 [2024-11-29 09:30:38.344540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.715 [2024-11-29 09:30:38.344601] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.715 [2024-11-29 09:30:38.344614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.715 [2024-11-29 09:30:38.344690] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.715 [2024-11-29 09:30:38.344703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.715 [2024-11-29 09:30:38.344764] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.715 [2024-11-29 09:30:38.344778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.715 #63 NEW cov: 11839 ft: 15099 corp: 39/835b lim: 35 exec/s: 63 rss: 70Mb L: 35/35 MS: 1 ChangeBit- 00:07:15.715 [2024-11-29 09:30:38.384437] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.715 [2024-11-29 09:30:38.384463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.715 [2024-11-29 09:30:38.384522] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.715 [2024-11-29 09:30:38.384536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.715 [2024-11-29 09:30:38.384601] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.715 [2024-11-29 09:30:38.384614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.715 [2024-11-29 09:30:38.384689] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:15.715 [2024-11-29 09:30:38.384702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.716 #64 NEW cov: 11839 ft: 15124 corp: 40/865b lim: 35 exec/s: 64 rss: 70Mb L: 30/35 MS: 1 ChangeBinInt- 00:07:15.716 [2024-11-29 09:30:38.424395] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:0000003d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.716 [2024-11-29 09:30:38.424420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.716 [2024-11-29 09:30:38.424478] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.716 [2024-11-29 09:30:38.424492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.716 [2024-11-29 09:30:38.424550] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.716 [2024-11-29 09:30:38.424563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.716 #65 NEW cov: 11839 ft: 15153 corp: 41/892b lim: 35 exec/s: 65 rss: 70Mb L: 27/35 MS: 1 ChangeBinInt- 00:07:15.716 [2024-11-29 09:30:38.464808] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.716 [2024-11-29 09:30:38.464833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.716 [2024-11-29 09:30:38.464906] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.716 [2024-11-29 09:30:38.464919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.716 [2024-11-29 09:30:38.464976] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.716 [2024-11-29 09:30:38.464989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.716 [2024-11-29 09:30:38.465045] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.716 [2024-11-29 09:30:38.465058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.716 [2024-11-29 09:30:38.465115] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.716 [2024-11-29 09:30:38.465128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.716 #66 NEW cov: 11839 ft: 15161 corp: 42/927b lim: 35 exec/s: 66 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:07:15.716 [2024-11-29 09:30:38.504852] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000036c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.716 [2024-11-29 09:30:38.504876] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.716 [2024-11-29 09:30:38.504952] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:0000036c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.716 [2024-11-29 09:30:38.504966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.716 [2024-11-29 09:30:38.505020] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000036c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.716 [2024-11-29 09:30:38.505033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.716 #67 NEW cov: 11839 ft: 15174 corp: 43/956b lim: 35 exec/s: 67 rss: 70Mb L: 29/35 MS: 1 ChangeByte- 00:07:15.716 [2024-11-29 09:30:38.544818] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000000f4 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.716 [2024-11-29 09:30:38.544843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.716 [2024-11-29 09:30:38.544902] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.716 [2024-11-29 09:30:38.544916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.976 #68 NEW cov: 11839 ft: 15191 corp: 44/979b lim: 35 exec/s: 68 rss: 70Mb L: 23/35 MS: 1 ChangeBinInt- 00:07:15.976 [2024-11-29 09:30:38.585154] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.976 [2024-11-29 09:30:38.585179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:15.976 [2024-11-29 09:30:38.585251] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.976 [2024-11-29 09:30:38.585265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:15.976 [2024-11-29 09:30:38.585320] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.976 [2024-11-29 09:30:38.585334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:15.976 [2024-11-29 09:30:38.585392] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.976 [2024-11-29 09:30:38.585406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:15.976 [2024-11-29 09:30:38.585463] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:15.976 [2024-11-29 09:30:38.585476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:15.976 #69 NEW cov: 11839 ft: 15202 corp: 45/1014b lim: 35 exec/s: 34 rss: 70Mb L: 35/35 MS: 1 ChangeBit- 00:07:15.976 #69 
DONE cov: 11839 ft: 15202 corp: 45/1014b lim: 35 exec/s: 34 rss: 70Mb
00:07:15.976 ###### Recommended dictionary. ######
00:07:15.976 "\001\000\000\000" # Uses: 4
00:07:15.976 ###### End of recommended dictionary. ######
00:07:15.976 Done 69 runs in 2 second(s)
00:07:15.976 09:30:38 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf
00:07:15.976 09:30:38 -- ../common.sh@72 -- # (( i++ ))
00:07:15.976 09:30:38 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:15.976 09:30:38 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1
00:07:15.976 09:30:38 -- nvmf/run.sh@23 -- # local fuzzer_type=16
00:07:15.976 09:30:38 -- nvmf/run.sh@24 -- # local timen=1
00:07:15.976 09:30:38 -- nvmf/run.sh@25 -- # local core=0x1
00:07:15.976 09:30:38 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:07:15.976 09:30:38 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf
00:07:15.976 09:30:38 -- nvmf/run.sh@29 -- # printf %02d 16
00:07:15.976 09:30:38 -- nvmf/run.sh@29 -- # port=4416
00:07:15.976 09:30:38 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:07:15.976 09:30:38 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416'
00:07:15.976 09:30:38 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:15.976 09:30:38 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock
00:07:16.215 [2024-11-29 09:30:38.780094] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:16.215 [2024-11-29 09:30:38.780182] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3186048 ]
00:07:16.215 EAL: No free 2048 kB hugepages reported on node 1
00:07:16.274 [2024-11-29 09:30:39.035387] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:16.336 [2024-11-29 09:30:39.125620] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:16.336 [2024-11-29 09:30:39.125761] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:16.393 [2024-11-29 09:30:39.183496] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:16.413 [2024-11-29 09:30:39.199865] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 ***
00:07:16.413 INFO: Running with entropic power schedule (0xFF, 100).
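The launch sequence traced above boils down to one invocation of the llvm_nvme_fuzz binary against the freshly started NVMe/TCP target. A minimal standalone sketch, assuming the same workspace layout; all paths and flags are copied from the trace, but the redirect into the config file is inferred, since the trace shows only the sed command:

  #!/usr/bin/env bash
  # Sketch of fuzz run 16 as nvmf/run.sh drives it (illustrative, not part of the captured output).
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  CORPUS="$SPDK/../corpus/llvm_nvmf_16"          # -D: per-run corpus directory
  NVMF_CFG=/tmp/fuzz_json_16.conf                # -c: target JSON config for this run
  TRID='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416'

  mkdir -p "$CORPUS"
  # Rewrite the template config so the target listens on this run's port (4416 instead of 4420).
  # Writing the result to $NVMF_CFG is an assumption; run.sh@33 shows only the sed filter itself.
  sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' \
      "$SPDK/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$NVMF_CFG"
  # -Z 16 matches fuzzer_type=16 and -t 1 matches timen=1 in the trace; -m 0x1 and -s 512 are the
  # usual SPDK app core mask and memory size (MB). -P appears to name the output/artifact location.
  "$SPDK/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
      -P "$SPDK/../output/llvm/" -F "$TRID" -c "$NVMF_CFG" -t 1 \
      -D "$CORPUS" -Z 16 -r /var/tmp/spdk16.sock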
00:07:16.493 INFO: Seed: 3301817593 00:07:16.493 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:16.493 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:16.493 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:07:16.493 INFO: A corpus is not provided, starting from an empty corpus 00:07:16.493 #2 INITED exec/s: 0 rss: 60Mb 00:07:16.493 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:16.493 This may also happen if the target rejected all inputs we tried so far 00:07:16.493 [2024-11-29 09:30:39.270587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731870416716476592 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.493 [2024-11-29 09:30:39.270629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:16.493 [2024-11-29 09:30:39.270698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:16.493 [2024-11-29 09:30:39.270723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:16.751 NEW_FUNC[1/671]: 0x451308 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:07:16.751 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:16.751 #5 NEW cov: 11663 ft: 11664 corp: 2/46b lim: 105 exec/s: 0 rss: 68Mb L: 45/45 MS: 3 CopyPart-InsertRepeatedBytes-InsertRepeatedBytes- 00:07:17.010 [2024-11-29 09:30:39.610993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:13596561545171611824 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.010 [2024-11-29 09:30:39.611055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.010 [2024-11-29 09:30:39.611185] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.010 [2024-11-29 09:30:39.611211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.010 #16 NEW cov: 11776 ft: 12355 corp: 3/92b lim: 105 exec/s: 0 rss: 68Mb L: 46/46 MS: 1 InsertByte- 00:07:17.010 [2024-11-29 09:30:39.660897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731870691594383536 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.010 [2024-11-29 09:30:39.660928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.010 [2024-11-29 09:30:39.661055] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.010 [2024-11-29 09:30:39.661081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.010 #17 NEW cov: 11782 ft: 12698 corp: 4/137b lim: 105 exec/s: 0 rss: 68Mb L: 45/46 
MS: 1 ChangeBit- 00:07:17.010 [2024-11-29 09:30:39.701085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:13596561545171611824 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.010 [2024-11-29 09:30:39.701116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.010 [2024-11-29 09:30:39.701237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.010 [2024-11-29 09:30:39.701261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.010 #18 NEW cov: 11867 ft: 12996 corp: 5/183b lim: 105 exec/s: 0 rss: 68Mb L: 46/46 MS: 1 ShuffleBytes- 00:07:17.010 [2024-11-29 09:30:39.751223] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731687897786265776 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.010 [2024-11-29 09:30:39.751253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.010 [2024-11-29 09:30:39.751370] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.010 [2024-11-29 09:30:39.751392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.010 #19 NEW cov: 11867 ft: 13090 corp: 6/229b lim: 105 exec/s: 0 rss: 68Mb L: 46/46 MS: 1 CrossOver- 00:07:17.010 [2024-11-29 09:30:39.791276] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:13596561545171611824 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.010 [2024-11-29 09:30:39.791307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.010 [2024-11-29 09:30:39.791416] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.010 [2024-11-29 09:30:39.791436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.010 #20 NEW cov: 11867 ft: 13187 corp: 7/276b lim: 105 exec/s: 0 rss: 68Mb L: 47/47 MS: 1 InsertByte- 00:07:17.010 [2024-11-29 09:30:39.831588] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731687897786265776 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.010 [2024-11-29 09:30:39.831626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.010 [2024-11-29 09:30:39.831738] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.011 [2024-11-29 09:30:39.831760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.011 [2024-11-29 09:30:39.831877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.011 [2024-11-29 
09:30:39.831896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.269 #21 NEW cov: 11867 ft: 13618 corp: 8/345b lim: 105 exec/s: 0 rss: 68Mb L: 69/69 MS: 1 InsertRepeatedBytes- 00:07:17.269 [2024-11-29 09:30:39.881343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731870416716476592 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.269 [2024-11-29 09:30:39.881372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.269 #22 NEW cov: 11867 ft: 14122 corp: 9/385b lim: 105 exec/s: 0 rss: 68Mb L: 40/69 MS: 1 EraseBytes- 00:07:17.269 [2024-11-29 09:30:39.921612] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731883610856009904 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.269 [2024-11-29 09:30:39.921642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.269 [2024-11-29 09:30:39.921739] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.269 [2024-11-29 09:30:39.921762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.269 #23 NEW cov: 11867 ft: 14165 corp: 10/432b lim: 105 exec/s: 0 rss: 68Mb L: 47/69 MS: 1 CopyPart- 00:07:17.269 [2024-11-29 09:30:39.961646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731687897786265776 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.269 [2024-11-29 09:30:39.961671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.269 #24 NEW cov: 11867 ft: 14190 corp: 11/467b lim: 105 exec/s: 0 rss: 68Mb L: 35/69 MS: 1 EraseBytes- 00:07:17.269 [2024-11-29 09:30:40.001888] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.269 [2024-11-29 09:30:40.001920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.269 [2024-11-29 09:30:40.002038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.269 [2024-11-29 09:30:40.002060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.269 #26 NEW cov: 11867 ft: 14236 corp: 12/521b lim: 105 exec/s: 0 rss: 68Mb L: 54/69 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:17.269 [2024-11-29 09:30:40.042491] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:13596561545171611824 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.269 [2024-11-29 09:30:40.042519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.269 [2024-11-29 09:30:40.042585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12731870420832074928 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:07:17.269 [2024-11-29 09:30:40.042601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.269 [2024-11-29 09:30:40.042727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.269 [2024-11-29 09:30:40.042749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.269 #27 NEW cov: 11867 ft: 14331 corp: 13/586b lim: 105 exec/s: 0 rss: 69Mb L: 65/69 MS: 1 CopyPart- 00:07:17.269 [2024-11-29 09:30:40.082072] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731870416716476592 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.269 [2024-11-29 09:30:40.082097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.269 #28 NEW cov: 11867 ft: 14354 corp: 14/626b lim: 105 exec/s: 0 rss: 69Mb L: 40/69 MS: 1 ChangeBinInt- 00:07:17.528 [2024-11-29 09:30:40.122418] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731687897786265776 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.528 [2024-11-29 09:30:40.122446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.528 [2024-11-29 09:30:40.122582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.528 [2024-11-29 09:30:40.122603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.528 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:17.528 #34 NEW cov: 11890 ft: 14439 corp: 15/672b lim: 105 exec/s: 0 rss: 69Mb L: 46/69 MS: 1 ShuffleBytes- 00:07:17.528 [2024-11-29 09:30:40.162474] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731870416716476592 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.528 [2024-11-29 09:30:40.162504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.528 [2024-11-29 09:30:40.162590] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.528 [2024-11-29 09:30:40.162614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.528 #35 NEW cov: 11890 ft: 14456 corp: 16/717b lim: 105 exec/s: 0 rss: 69Mb L: 45/69 MS: 1 ShuffleBytes- 00:07:17.528 [2024-11-29 09:30:40.202484] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731870416716476592 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.528 [2024-11-29 09:30:40.202509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.528 #36 NEW cov: 11890 ft: 14474 corp: 17/753b lim: 105 exec/s: 0 rss: 69Mb L: 36/69 MS: 1 EraseBytes- 00:07:17.528 [2024-11-29 09:30:40.242896] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12754194140597629104 len:48305 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.528 [2024-11-29 09:30:40.242928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.528 [2024-11-29 09:30:40.242979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744072378953983 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.528 [2024-11-29 09:30:40.243001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.528 [2024-11-29 09:30:40.243116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.528 [2024-11-29 09:30:40.243136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.528 #37 NEW cov: 11890 ft: 14492 corp: 18/818b lim: 105 exec/s: 37 rss: 69Mb L: 65/69 MS: 1 CopyPart- 00:07:17.528 [2024-11-29 09:30:40.282832] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731687897786265776 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.528 [2024-11-29 09:30:40.282862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.528 [2024-11-29 09:30:40.282972] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.528 [2024-11-29 09:30:40.282993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.528 #38 NEW cov: 11890 ft: 14507 corp: 19/864b lim: 105 exec/s: 38 rss: 69Mb L: 46/69 MS: 1 CrossOver- 00:07:17.528 [2024-11-29 09:30:40.323022] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731687897786265776 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.528 [2024-11-29 09:30:40.323053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.528 [2024-11-29 09:30:40.323159] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.528 [2024-11-29 09:30:40.323182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.528 #39 NEW cov: 11890 ft: 14539 corp: 20/910b lim: 105 exec/s: 39 rss: 69Mb L: 46/69 MS: 1 ChangeBinInt- 00:07:17.528 [2024-11-29 09:30:40.363058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:13596561545171611824 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.528 [2024-11-29 09:30:40.363086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.528 [2024-11-29 09:30:40.363186] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.528 [2024-11-29 09:30:40.363211] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.787 #40 NEW cov: 11890 ft: 14557 corp: 21/956b lim: 105 exec/s: 40 rss: 69Mb L: 46/69 MS: 1 ChangeBit- 00:07:17.787 [2024-11-29 09:30:40.403396] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:13596561545171611824 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.787 [2024-11-29 09:30:40.403427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.787 [2024-11-29 09:30:40.403515] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12731870420832074928 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.787 [2024-11-29 09:30:40.403541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.787 [2024-11-29 09:30:40.403663] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.787 [2024-11-29 09:30:40.403683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.787 #41 NEW cov: 11890 ft: 14568 corp: 22/1021b lim: 105 exec/s: 41 rss: 69Mb L: 65/69 MS: 1 ChangeByte- 00:07:17.787 [2024-11-29 09:30:40.443351] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12735248116437004464 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.787 [2024-11-29 09:30:40.443384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.787 [2024-11-29 09:30:40.443494] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.787 [2024-11-29 09:30:40.443519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.787 #42 NEW cov: 11890 ft: 14598 corp: 23/1068b lim: 105 exec/s: 42 rss: 69Mb L: 47/69 MS: 1 CopyPart- 00:07:17.787 [2024-11-29 09:30:40.483356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731687897786265776 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.787 [2024-11-29 09:30:40.483381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.787 #43 NEW cov: 11890 ft: 14606 corp: 24/1102b lim: 105 exec/s: 43 rss: 69Mb L: 34/69 MS: 1 EraseBytes- 00:07:17.788 [2024-11-29 09:30:40.523505] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731870416716476592 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.788 [2024-11-29 09:30:40.523531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.788 #44 NEW cov: 11890 ft: 14621 corp: 25/1142b lim: 105 exec/s: 44 rss: 69Mb L: 40/69 MS: 1 CopyPart- 00:07:17.788 [2024-11-29 09:30:40.564016] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:13596560832207040688 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.788 [2024-11-29 09:30:40.564045] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.788 [2024-11-29 09:30:40.564129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744072378953904 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.788 [2024-11-29 09:30:40.564154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:17.788 [2024-11-29 09:30:40.564268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18424420349828399103 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.788 [2024-11-29 09:30:40.564287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:17.788 [2024-11-29 09:30:40.564411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18427798049548926975 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.788 [2024-11-29 09:30:40.564436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:17.788 #45 NEW cov: 11890 ft: 15106 corp: 26/1234b lim: 105 exec/s: 45 rss: 69Mb L: 92/92 MS: 1 CrossOver- 00:07:17.788 [2024-11-29 09:30:40.603837] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731687897786265776 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.788 [2024-11-29 09:30:40.603868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:17.788 [2024-11-29 09:30:40.603987] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073697099775 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:17.788 [2024-11-29 09:30:40.604009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.047 #46 NEW cov: 11890 ft: 15149 corp: 27/1280b lim: 105 exec/s: 46 rss: 69Mb L: 46/92 MS: 1 ChangeByte- 00:07:18.047 [2024-11-29 09:30:40.644237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:13596561545171611824 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.047 [2024-11-29 09:30:40.644269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.047 [2024-11-29 09:30:40.644381] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12731870420832074928 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.047 [2024-11-29 09:30:40.644404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.047 [2024-11-29 09:30:40.644525] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744069431361535 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.047 [2024-11-29 09:30:40.644548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.047 #47 NEW cov: 11890 ft: 15168 corp: 28/1345b lim: 105 exec/s: 47 rss: 69Mb L: 65/92 MS: 1 ChangeBinInt- 00:07:18.047 [2024-11-29 09:30:40.684277] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731870419501494448 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.047 [2024-11-29 09:30:40.684309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.047 [2024-11-29 09:30:40.684428] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.047 [2024-11-29 09:30:40.684451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.047 [2024-11-29 09:30:40.684559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12731870419501494448 len:45312 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.047 [2024-11-29 09:30:40.684582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.047 #48 NEW cov: 11890 ft: 15183 corp: 29/1424b lim: 105 exec/s: 48 rss: 69Mb L: 79/92 MS: 1 CopyPart- 00:07:18.047 [2024-11-29 09:30:40.724296] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731883610856009904 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.047 [2024-11-29 09:30:40.724332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.047 [2024-11-29 09:30:40.724445] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.047 [2024-11-29 09:30:40.724480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.047 #49 NEW cov: 11890 ft: 15189 corp: 30/1471b lim: 105 exec/s: 49 rss: 69Mb L: 47/92 MS: 1 ChangeBit- 00:07:18.047 [2024-11-29 09:30:40.764212] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731688237088682160 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.047 [2024-11-29 09:30:40.764239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.047 #50 NEW cov: 11890 ft: 15192 corp: 31/1497b lim: 105 exec/s: 50 rss: 70Mb L: 26/92 MS: 1 EraseBytes- 00:07:18.047 [2024-11-29 09:30:40.804332] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:11578766393179418800 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.047 [2024-11-29 09:30:40.804358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.047 #51 NEW cov: 11890 ft: 15212 corp: 32/1531b lim: 105 exec/s: 51 rss: 70Mb L: 34/92 MS: 1 ChangeBit- 00:07:18.047 [2024-11-29 09:30:40.844782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731870416716476592 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.047 [2024-11-29 09:30:40.844816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.047 [2024-11-29 09:30:40.844937] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:18.047 [2024-11-29 09:30:40.844960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.047 [2024-11-29 09:30:40.845082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.047 [2024-11-29 09:30:40.845104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.047 #52 NEW cov: 11890 ft: 15250 corp: 33/1600b lim: 105 exec/s: 52 rss: 70Mb L: 69/92 MS: 1 InsertRepeatedBytes- 00:07:18.047 [2024-11-29 09:30:40.884632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12735248116437004464 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.047 [2024-11-29 09:30:40.884668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.047 [2024-11-29 09:30:40.884791] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.047 [2024-11-29 09:30:40.884811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.306 #53 NEW cov: 11890 ft: 15261 corp: 34/1647b lim: 105 exec/s: 53 rss: 70Mb L: 47/92 MS: 1 ChangeBinInt- 00:07:18.306 [2024-11-29 09:30:40.924995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:13596561545171611824 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:40.925027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.306 [2024-11-29 09:30:40.925114] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12731870420832074928 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:40.925136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.306 [2024-11-29 09:30:40.925255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:40.925279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.306 #54 NEW cov: 11890 ft: 15282 corp: 35/1712b lim: 105 exec/s: 54 rss: 70Mb L: 65/92 MS: 1 CopyPart- 00:07:18.306 [2024-11-29 09:30:40.964886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12735248116437004464 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:40.964918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.306 [2024-11-29 09:30:40.965021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:40.965044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:07:18.306 #55 NEW cov: 11890 ft: 15301 corp: 36/1760b lim: 105 exec/s: 55 rss: 70Mb L: 48/92 MS: 1 InsertByte- 00:07:18.306 [2024-11-29 09:30:41.005248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12754194140597629104 len:48305 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:41.005277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.306 [2024-11-29 09:30:41.005359] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744072378953983 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:41.005381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.306 [2024-11-29 09:30:41.005498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:41.005523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.306 #56 NEW cov: 11890 ft: 15310 corp: 37/1826b lim: 105 exec/s: 56 rss: 70Mb L: 66/92 MS: 1 InsertByte- 00:07:18.306 [2024-11-29 09:30:41.055156] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:13596561545171611824 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:41.055189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.306 [2024-11-29 09:30:41.055292] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:41.055313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.306 #57 NEW cov: 11890 ft: 15315 corp: 38/1872b lim: 105 exec/s: 57 rss: 70Mb L: 46/92 MS: 1 CopyPart- 00:07:18.306 [2024-11-29 09:30:41.095463] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731687897786265776 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:41.095493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.306 [2024-11-29 09:30:41.095579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744069431361535 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:41.095604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.306 [2024-11-29 09:30:41.095732] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:41.095753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.306 #58 NEW cov: 11890 ft: 15326 corp: 39/1941b lim: 105 exec/s: 58 rss: 70Mb L: 69/92 MS: 1 ChangeBinInt- 00:07:18.306 [2024-11-29 09:30:41.145878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 
nsid:0 lba:12731870419501494448 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:41.145910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.306 [2024-11-29 09:30:41.146020] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:41.146041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.306 [2024-11-29 09:30:41.146157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:12731870419501494448 len:45312 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:41.146180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:18.306 [2024-11-29 09:30:41.146299] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.306 [2024-11-29 09:30:41.146319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:18.566 #59 NEW cov: 11890 ft: 15403 corp: 40/2025b lim: 105 exec/s: 59 rss: 70Mb L: 84/92 MS: 1 InsertRepeatedBytes- 00:07:18.566 [2024-11-29 09:30:41.195540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.566 [2024-11-29 09:30:41.195572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.566 [2024-11-29 09:30:41.195691] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.566 [2024-11-29 09:30:41.195710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.566 #60 NEW cov: 11890 ft: 15414 corp: 41/2080b lim: 105 exec/s: 60 rss: 70Mb L: 55/92 MS: 1 InsertByte- 00:07:18.566 [2024-11-29 09:30:41.235676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12731883606561042608 len:45233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.566 [2024-11-29 09:30:41.235706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:18.566 [2024-11-29 09:30:41.235815] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:18.566 [2024-11-29 09:30:41.235837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:18.566 #61 NEW cov: 11890 ft: 15433 corp: 42/2127b lim: 105 exec/s: 30 rss: 70Mb L: 47/92 MS: 1 ChangeBinInt- 00:07:18.566 #61 DONE cov: 11890 ft: 15433 corp: 42/2127b lim: 105 exec/s: 30 rss: 70Mb 00:07:18.566 Done 61 runs in 2 second(s) 00:07:18.566 09:30:41 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:07:18.566 09:30:41 -- ../common.sh@72 -- # (( i++ )) 00:07:18.566 09:30:41 -- ../common.sh@72 -- # (( i < 
fuzz_num )) 00:07:18.566 09:30:41 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:07:18.566 09:30:41 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:07:18.566 09:30:41 -- nvmf/run.sh@24 -- # local timen=1 00:07:18.566 09:30:41 -- nvmf/run.sh@25 -- # local core=0x1 00:07:18.566 09:30:41 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:18.566 09:30:41 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:07:18.566 09:30:41 -- nvmf/run.sh@29 -- # printf %02d 17 00:07:18.566 09:30:41 -- nvmf/run.sh@29 -- # port=4417 00:07:18.566 09:30:41 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:18.566 09:30:41 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:07:18.566 09:30:41 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:18.566 09:30:41 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:07:18.825 [2024-11-29 09:30:41.422291] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:18.825 [2024-11-29 09:30:41.422358] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3186591 ] 00:07:18.825 EAL: No free 2048 kB hugepages reported on node 1 00:07:19.084 [2024-11-29 09:30:41.675524] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:19.084 [2024-11-29 09:30:41.765810] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:19.084 [2024-11-29 09:30:41.765952] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.084 [2024-11-29 09:30:41.823728] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:19.084 [2024-11-29 09:30:41.839969] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:07:19.084 INFO: Running with entropic power schedule (0xFF, 100). 00:07:19.084 INFO: Seed: 1648850946 00:07:19.084 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:19.084 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:19.084 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:07:19.084 INFO: A corpus is not provided, starting from an empty corpus 00:07:19.084 #2 INITED exec/s: 0 rss: 60Mb 00:07:19.084 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:19.084 This may also happen if the target rejected all inputs we tried so far 00:07:19.084 [2024-11-29 09:30:41.895556] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.084 [2024-11-29 09:30:41.895586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.084 [2024-11-29 09:30:41.895630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.084 [2024-11-29 09:30:41.895646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.085 [2024-11-29 09:30:41.895696] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.085 [2024-11-29 09:30:41.895712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.085 [2024-11-29 09:30:41.895763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.085 [2024-11-29 09:30:41.895778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:19.653 NEW_FUNC[1/672]: 0x4545f8 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:07:19.653 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:19.653 #8 NEW cov: 11684 ft: 11685 corp: 2/115b lim: 120 exec/s: 0 rss: 68Mb L: 114/114 MS: 1 InsertRepeatedBytes- 00:07:19.653 [2024-11-29 09:30:42.215951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.215990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.653 #12 NEW cov: 11797 ft: 13052 corp: 3/155b lim: 120 exec/s: 0 rss: 68Mb L: 40/114 MS: 4 InsertByte-ChangeByte-CMP-InsertRepeatedBytes- DE: "\000\000\000\000\000\000\000\000"- 00:07:19.653 [2024-11-29 09:30:42.256426] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.256455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.653 [2024-11-29 09:30:42.256506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.256522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.653 [2024-11-29 09:30:42.256574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.256589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 
m:0 dnr:1 00:07:19.653 [2024-11-29 09:30:42.256647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.256662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:19.653 #13 NEW cov: 11803 ft: 13303 corp: 4/269b lim: 120 exec/s: 0 rss: 68Mb L: 114/114 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:19.653 [2024-11-29 09:30:42.296521] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.296550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.653 [2024-11-29 09:30:42.296589] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.296609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.653 [2024-11-29 09:30:42.296661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.296676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.653 [2024-11-29 09:30:42.296731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:84662395338752 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.296746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:19.653 #19 NEW cov: 11888 ft: 13599 corp: 5/383b lim: 120 exec/s: 0 rss: 68Mb L: 114/114 MS: 1 ChangeByte- 00:07:19.653 [2024-11-29 09:30:42.346253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.346282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.653 #20 NEW cov: 11888 ft: 13782 corp: 6/423b lim: 120 exec/s: 0 rss: 68Mb L: 40/114 MS: 1 ChangeByte- 00:07:19.653 [2024-11-29 09:30:42.386792] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4611686018595160064 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.386820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.653 [2024-11-29 09:30:42.386868] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.386883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.653 [2024-11-29 09:30:42.386933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.386948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.653 [2024-11-29 09:30:42.386999] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:84662395338752 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.387014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:19.653 #21 NEW cov: 11888 ft: 13851 corp: 7/537b lim: 120 exec/s: 0 rss: 68Mb L: 114/114 MS: 1 ChangeBit- 00:07:19.653 [2024-11-29 09:30:42.426915] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:13052674048 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.426943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.653 [2024-11-29 09:30:42.426977] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.426993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.653 [2024-11-29 09:30:42.427042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.427060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.653 [2024-11-29 09:30:42.427114] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.427129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:19.653 #22 NEW cov: 11888 ft: 13938 corp: 8/651b lim: 120 exec/s: 0 rss: 68Mb L: 114/114 MS: 1 ChangeBinInt- 00:07:19.653 [2024-11-29 09:30:42.466998] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.467025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.653 [2024-11-29 09:30:42.467088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.467103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.653 [2024-11-29 09:30:42.467157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.467173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.653 [2024-11-29 09:30:42.467225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:4294967040 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.653 [2024-11-29 09:30:42.467240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:19.653 #23 NEW cov: 11888 ft: 14011 corp: 9/765b lim: 120 
exec/s: 0 rss: 68Mb L: 114/114 MS: 1 ChangeBinInt- 00:07:19.913 [2024-11-29 09:30:42.507136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.507163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.507212] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.507227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.507282] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.507297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.507350] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.507364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:19.913 #24 NEW cov: 11888 ft: 14070 corp: 10/879b lim: 120 exec/s: 0 rss: 68Mb L: 114/114 MS: 1 CrossOver- 00:07:19.913 [2024-11-29 09:30:42.547253] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.547282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.547341] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.547357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.547412] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.547428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.547482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:4294967040 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.547497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:19.913 #25 NEW cov: 11888 ft: 14105 corp: 11/993b lim: 120 exec/s: 0 rss: 68Mb L: 114/114 MS: 1 ShuffleBytes- 00:07:19.913 [2024-11-29 09:30:42.587362] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.587389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.587432] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.587447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.587497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.587512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.587564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:4294967040 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.587578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:19.913 #26 NEW cov: 11888 ft: 14132 corp: 12/1107b lim: 120 exec/s: 0 rss: 68Mb L: 114/114 MS: 1 ShuffleBytes- 00:07:19.913 [2024-11-29 09:30:42.627472] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.627499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.627562] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.627577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.627624] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.627639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.627692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:71776123339472895 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.627706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:19.913 #27 NEW cov: 11888 ft: 14187 corp: 13/1221b lim: 120 exec/s: 0 rss: 68Mb L: 114/114 MS: 1 ShuffleBytes- 00:07:19.913 [2024-11-29 09:30:42.667596] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.667627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.667681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.667698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.667751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:19.913 [2024-11-29 09:30:42.667765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.667817] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:4294967040 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.667833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:19.913 #28 NEW cov: 11888 ft: 14216 corp: 14/1335b lim: 120 exec/s: 0 rss: 68Mb L: 114/114 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:19.913 [2024-11-29 09:30:42.707587] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.707620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.707687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.707703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.913 [2024-11-29 09:30:42.707756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.913 [2024-11-29 09:30:42.707780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:19.913 #29 NEW cov: 11888 ft: 14579 corp: 15/1414b lim: 120 exec/s: 0 rss: 68Mb L: 79/114 MS: 1 EraseBytes- 00:07:19.914 [2024-11-29 09:30:42.747698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.914 [2024-11-29 09:30:42.747725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:19.914 [2024-11-29 09:30:42.747763] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.914 [2024-11-29 09:30:42.747776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:19.914 [2024-11-29 09:30:42.747827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1157627904 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:19.914 [2024-11-29 09:30:42.747842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.173 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:20.173 #30 NEW cov: 11911 ft: 14594 corp: 16/1498b lim: 120 exec/s: 0 rss: 69Mb L: 84/114 MS: 1 InsertRepeatedBytes- 00:07:20.173 [2024-11-29 09:30:42.797975] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.798003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:07:20.173 [2024-11-29 09:30:42.798052] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.798066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.173 [2024-11-29 09:30:42.798118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.798135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.173 [2024-11-29 09:30:42.798190] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.798205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.173 #31 NEW cov: 11911 ft: 14609 corp: 17/1612b lim: 120 exec/s: 0 rss: 69Mb L: 114/114 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:07:20.173 [2024-11-29 09:30:42.838118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1158283264 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.838145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.173 [2024-11-29 09:30:42.838192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.838207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.173 [2024-11-29 09:30:42.838258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:250 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.838272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.173 [2024-11-29 09:30:42.838324] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.838338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.173 #36 NEW cov: 11911 ft: 14625 corp: 18/1709b lim: 120 exec/s: 0 rss: 69Mb L: 97/114 MS: 5 CrossOver-InsertByte-ChangeBit-CopyPart-CrossOver- 00:07:20.173 [2024-11-29 09:30:42.878199] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.878226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.173 [2024-11-29 09:30:42.878274] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.878289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.173 [2024-11-29 09:30:42.878342] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.878356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.173 [2024-11-29 09:30:42.878410] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.878425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.173 #37 NEW cov: 11911 ft: 14671 corp: 19/1823b lim: 120 exec/s: 37 rss: 69Mb L: 114/114 MS: 1 ShuffleBytes- 00:07:20.173 [2024-11-29 09:30:42.917856] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.917883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.173 #38 NEW cov: 11911 ft: 14829 corp: 20/1863b lim: 120 exec/s: 38 rss: 69Mb L: 40/114 MS: 1 CopyPart- 00:07:20.173 [2024-11-29 09:30:42.958284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.958315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.173 [2024-11-29 09:30:42.958351] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.958366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.173 [2024-11-29 09:30:42.958417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1157627904 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.958431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.173 #39 NEW cov: 11911 ft: 14842 corp: 21/1947b lim: 120 exec/s: 39 rss: 69Mb L: 84/114 MS: 1 ChangeBinInt- 00:07:20.173 [2024-11-29 09:30:42.998111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.173 [2024-11-29 09:30:42.998138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.432 #40 NEW cov: 11911 ft: 14867 corp: 22/1987b lim: 120 exec/s: 40 rss: 69Mb L: 40/114 MS: 1 ChangeBinInt- 00:07:20.432 [2024-11-29 09:30:43.038687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.432 [2024-11-29 09:30:43.038714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.432 [2024-11-29 09:30:43.038762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.432 [2024-11-29 09:30:43.038777] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.432 [2024-11-29 09:30:43.038828] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.432 [2024-11-29 09:30:43.038843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.432 [2024-11-29 09:30:43.038895] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:84662395338752 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.432 [2024-11-29 09:30:43.038909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.432 #41 NEW cov: 11911 ft: 14919 corp: 23/2101b lim: 120 exec/s: 41 rss: 70Mb L: 114/114 MS: 1 ShuffleBytes- 00:07:20.432 [2024-11-29 09:30:43.078796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.432 [2024-11-29 09:30:43.078825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.432 [2024-11-29 09:30:43.078861] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.432 [2024-11-29 09:30:43.078877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.432 [2024-11-29 09:30:43.078931] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.432 [2024-11-29 09:30:43.078946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.432 [2024-11-29 09:30:43.078996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.432 [2024-11-29 09:30:43.079010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.432 #42 NEW cov: 11911 ft: 14925 corp: 24/2215b lim: 120 exec/s: 42 rss: 70Mb L: 114/114 MS: 1 ChangeBit- 00:07:20.432 [2024-11-29 09:30:43.118925] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.432 [2024-11-29 09:30:43.118953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.432 [2024-11-29 09:30:43.118992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.432 [2024-11-29 09:30:43.119006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.432 [2024-11-29 09:30:43.119057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.432 [2024-11-29 09:30:43.119072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.432 [2024-11-29 09:30:43.119124] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:84662395338752 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.432 [2024-11-29 09:30:43.119139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.432 #43 NEW cov: 11911 ft: 14962 corp: 25/2329b lim: 120 exec/s: 43 rss: 70Mb L: 114/114 MS: 1 ShuffleBytes- 00:07:20.432 [2024-11-29 09:30:43.159053] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4462739456 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.433 [2024-11-29 09:30:43.159080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.433 [2024-11-29 09:30:43.159128] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.433 [2024-11-29 09:30:43.159142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.433 [2024-11-29 09:30:43.159193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.433 [2024-11-29 09:30:43.159208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.433 [2024-11-29 09:30:43.159261] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.433 [2024-11-29 09:30:43.159276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.433 #44 NEW cov: 11911 ft: 14971 corp: 26/2443b lim: 120 exec/s: 44 rss: 70Mb L: 114/114 MS: 1 CMP- DE: "\001\004"- 00:07:20.433 [2024-11-29 09:30:43.199008] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4611686018595160064 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.433 [2024-11-29 09:30:43.199036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.433 [2024-11-29 09:30:43.199072] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.433 [2024-11-29 09:30:43.199087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.433 [2024-11-29 09:30:43.199142] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.433 [2024-11-29 09:30:43.199157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.433 #45 NEW cov: 11911 ft: 15067 corp: 27/2516b lim: 120 exec/s: 45 rss: 70Mb L: 73/114 MS: 1 EraseBytes- 00:07:20.433 [2024-11-29 09:30:43.239285] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.433 [2024-11-29 09:30:43.239312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.433 [2024-11-29 
09:30:43.239361] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.433 [2024-11-29 09:30:43.239376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.433 [2024-11-29 09:30:43.239428] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.433 [2024-11-29 09:30:43.239442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.433 [2024-11-29 09:30:43.239494] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.433 [2024-11-29 09:30:43.239508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.433 #46 NEW cov: 11911 ft: 15125 corp: 28/2630b lim: 120 exec/s: 46 rss: 70Mb L: 114/114 MS: 1 ChangeBit- 00:07:20.692 [2024-11-29 09:30:43.278949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.692 [2024-11-29 09:30:43.278976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.692 #47 NEW cov: 11911 ft: 15161 corp: 29/2665b lim: 120 exec/s: 47 rss: 70Mb L: 35/114 MS: 1 EraseBytes- 00:07:20.692 [2024-11-29 09:30:43.319490] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.692 [2024-11-29 09:30:43.319517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.692 [2024-11-29 09:30:43.319565] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.692 [2024-11-29 09:30:43.319581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.693 [2024-11-29 09:30:43.319651] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.319665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.693 [2024-11-29 09:30:43.319719] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:84662395338752 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.319734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.693 #48 NEW cov: 11911 ft: 15168 corp: 30/2779b lim: 120 exec/s: 48 rss: 70Mb L: 114/114 MS: 1 ShuffleBytes- 00:07:20.693 [2024-11-29 09:30:43.359608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.359635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.693 
[2024-11-29 09:30:43.359684] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.359699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.693 [2024-11-29 09:30:43.359750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.359768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.693 [2024-11-29 09:30:43.359820] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:4294967040 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.359835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.693 #49 NEW cov: 11911 ft: 15192 corp: 31/2893b lim: 120 exec/s: 49 rss: 70Mb L: 114/114 MS: 1 ChangeByte- 00:07:20.693 [2024-11-29 09:30:43.399701] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.399728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.693 [2024-11-29 09:30:43.399777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.399792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.693 [2024-11-29 09:30:43.399845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.399860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.693 [2024-11-29 09:30:43.399913] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.399927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.693 #50 NEW cov: 11911 ft: 15198 corp: 32/3007b lim: 120 exec/s: 50 rss: 70Mb L: 114/114 MS: 1 CopyPart- 00:07:20.693 [2024-11-29 09:30:43.439909] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.439937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.693 [2024-11-29 09:30:43.439992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.440008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.693 [2024-11-29 09:30:43.440061] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.440076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.693 [2024-11-29 09:30:43.440132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:330712481792 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.440146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.693 #51 NEW cov: 11911 ft: 15199 corp: 33/3122b lim: 120 exec/s: 51 rss: 70Mb L: 115/115 MS: 1 InsertByte- 00:07:20.693 [2024-11-29 09:30:43.480036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.480063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.693 [2024-11-29 09:30:43.480124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.480140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.693 [2024-11-29 09:30:43.480194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.480210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.693 [2024-11-29 09:30:43.480264] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.480279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.693 #52 NEW cov: 11911 ft: 15211 corp: 34/3236b lim: 120 exec/s: 52 rss: 70Mb L: 114/115 MS: 1 ChangeBit- 00:07:20.693 [2024-11-29 09:30:43.520116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.520144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.693 [2024-11-29 09:30:43.520191] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.520207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.693 [2024-11-29 09:30:43.520259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.520273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.693 [2024-11-29 09:30:43.520326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:84662395338752 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.693 [2024-11-29 09:30:43.520341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.952 #53 NEW cov: 11911 ft: 15226 corp: 35/3350b lim: 120 exec/s: 53 rss: 70Mb L: 114/115 MS: 1 CrossOver- 00:07:20.952 [2024-11-29 09:30:43.560200] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4462739456 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.952 [2024-11-29 09:30:43.560228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.952 [2024-11-29 09:30:43.560291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.952 [2024-11-29 09:30:43.560308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.952 [2024-11-29 09:30:43.560358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.952 [2024-11-29 09:30:43.560374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.952 [2024-11-29 09:30:43.560428] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.560443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.953 #54 NEW cov: 11911 ft: 15263 corp: 36/3446b lim: 120 exec/s: 54 rss: 70Mb L: 96/115 MS: 1 EraseBytes- 00:07:20.953 [2024-11-29 09:30:43.599864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.599890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.953 #55 NEW cov: 11911 ft: 15290 corp: 37/3486b lim: 120 exec/s: 55 rss: 70Mb L: 40/115 MS: 1 ChangeBit- 00:07:20.953 [2024-11-29 09:30:43.640264] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.640291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.953 [2024-11-29 09:30:43.640328] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.640343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.953 [2024-11-29 09:30:43.640396] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.640411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.953 #56 NEW cov: 11911 ft: 15296 corp: 38/3558b lim: 120 exec/s: 56 rss: 70Mb L: 72/115 MS: 1 EraseBytes- 00:07:20.953 [2024-11-29 09:30:43.680565] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4462739456 len:1 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.680593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.953 [2024-11-29 09:30:43.680645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.680661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.953 [2024-11-29 09:30:43.680714] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.680730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.953 [2024-11-29 09:30:43.680782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:84662395338752 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.680795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.953 #57 NEW cov: 11911 ft: 15301 corp: 39/3672b lim: 120 exec/s: 57 rss: 70Mb L: 114/115 MS: 1 PersAutoDict- DE: "\001\004"- 00:07:20.953 [2024-11-29 09:30:43.720661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:4462739456 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.720687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.953 [2024-11-29 09:30:43.720750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.720766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.953 [2024-11-29 09:30:43.720821] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.720836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.953 [2024-11-29 09:30:43.720890] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:84662395338752 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.720905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.953 #58 NEW cov: 11911 ft: 15307 corp: 40/3787b lim: 120 exec/s: 58 rss: 70Mb L: 115/115 MS: 1 InsertByte- 00:07:20.953 [2024-11-29 09:30:43.760797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.760828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:20.953 [2024-11-29 09:30:43.760864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.760880] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:20.953 [2024-11-29 09:30:43.760933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.760948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:20.953 [2024-11-29 09:30:43.761000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:20.953 [2024-11-29 09:30:43.761015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:20.953 #59 NEW cov: 11911 ft: 15340 corp: 41/3889b lim: 120 exec/s: 59 rss: 70Mb L: 102/115 MS: 1 InsertRepeatedBytes- 00:07:21.213 [2024-11-29 09:30:43.800899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.213 [2024-11-29 09:30:43.800927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.213 [2024-11-29 09:30:43.800970] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.213 [2024-11-29 09:30:43.800985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.213 [2024-11-29 09:30:43.801039] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.213 [2024-11-29 09:30:43.801054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:21.213 [2024-11-29 09:30:43.801108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:4294967295 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.213 [2024-11-29 09:30:43.801123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:21.213 #60 NEW cov: 11911 ft: 15369 corp: 42/4004b lim: 120 exec/s: 60 rss: 70Mb L: 115/115 MS: 1 InsertByte- 00:07:21.213 [2024-11-29 09:30:43.840835] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:261 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.213 [2024-11-29 09:30:43.840863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.213 [2024-11-29 09:30:43.840900] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.213 [2024-11-29 09:30:43.840915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.213 [2024-11-29 09:30:43.840968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.213 [2024-11-29 09:30:43.840984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:07:21.213 #61 NEW cov: 11911 ft: 15382 corp: 43/4078b lim: 120 exec/s: 61 rss: 70Mb L: 74/115 MS: 1 PersAutoDict- DE: "\001\004"- 00:07:21.213 [2024-11-29 09:30:43.880813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.213 [2024-11-29 09:30:43.880841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.213 [2024-11-29 09:30:43.880897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:21.213 [2024-11-29 09:30:43.880913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.213 #62 NEW cov: 11911 ft: 15726 corp: 44/4140b lim: 120 exec/s: 31 rss: 70Mb L: 62/115 MS: 1 EraseBytes- 00:07:21.213 #62 DONE cov: 11911 ft: 15726 corp: 44/4140b lim: 120 exec/s: 31 rss: 70Mb 00:07:21.213 ###### Recommended dictionary. ###### 00:07:21.213 "\000\000\000\000\000\000\000\000" # Uses: 3 00:07:21.213 "\001\004" # Uses: 2 00:07:21.213 ###### End of recommended dictionary. ###### 00:07:21.213 Done 62 runs in 2 second(s) 00:07:21.213 09:30:44 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:07:21.213 09:30:44 -- ../common.sh@72 -- # (( i++ )) 00:07:21.213 09:30:44 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:21.213 09:30:44 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:07:21.213 09:30:44 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:07:21.213 09:30:44 -- nvmf/run.sh@24 -- # local timen=1 00:07:21.213 09:30:44 -- nvmf/run.sh@25 -- # local core=0x1 00:07:21.213 09:30:44 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:21.213 09:30:44 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:07:21.213 09:30:44 -- nvmf/run.sh@29 -- # printf %02d 18 00:07:21.213 09:30:44 -- nvmf/run.sh@29 -- # port=4418 00:07:21.213 09:30:44 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:21.213 09:30:44 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:07:21.213 09:30:44 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:21.213 09:30:44 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:07:21.473 [2024-11-29 09:30:44.073629] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
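The trace above closes out fuzzer 17 (libFuzzer's "#62 DONE" line, the recommended dictionary with its use counts, "Done 62 runs in 2 second(s)") and run.sh immediately stands up fuzzer 18: a fresh /tmp/fuzz_json_18.conf, a corpus directory llvm_nvmf_18, and a sed that rewrites the config's default trsvcid 4420 to 4418 so each run's NVMe/TCP target listens on its own port. Below is a hypothetical C sketch of that addressing convention; build_trid is an illustrative name, not an SPDK function, and port = 4400 + run number is inferred from runs 18 and 19 in this log rather than stated anywhere:

    #include <stdio.h>

    /* Hypothetical helper (not in SPDK): reproduce the per-run target address
     * that run.sh assembles for llvm_nvme_fuzz. Observed in this log:
     * run 18 listens on 4418, run 19 on 4419; the JSON config's default
     * trsvcid of 4420 is rewritten by sed to match. */
    static void build_trid(int fuzzer_type, char *buf, size_t len)
    {
        snprintf(buf, len,
                 "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 "
                 "traddr:127.0.0.1 trsvcid:%d", 4400 + fuzzer_type);
    }

    int main(void)
    {
        char trid[128];
        build_trid(18, trid, sizeof(trid));
        puts(trid); /* matches the -F argument in the command line above */
        return 0;
    }
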
00:07:21.473 [2024-11-29 09:30:44.073717] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3186996 ] 00:07:21.473 EAL: No free 2048 kB hugepages reported on node 1 00:07:21.732 [2024-11-29 09:30:44.327351] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.732 [2024-11-29 09:30:44.410515] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:21.732 [2024-11-29 09:30:44.410656] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.732 [2024-11-29 09:30:44.468883] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:21.732 [2024-11-29 09:30:44.485201] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:07:21.732 INFO: Running with entropic power schedule (0xFF, 100). 00:07:21.732 INFO: Seed: 4293847149 00:07:21.732 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:21.732 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:21.732 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:07:21.732 INFO: A corpus is not provided, starting from an empty corpus 00:07:21.732 #2 INITED exec/s: 0 rss: 60Mb 00:07:21.732 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:21.732 This may also happen if the target rejected all inputs we tried so far 00:07:21.732 [2024-11-29 09:30:44.529893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:21.732 [2024-11-29 09:30:44.529926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:21.732 [2024-11-29 09:30:44.529978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:21.732 [2024-11-29 09:30:44.529994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:21.732 [2024-11-29 09:30:44.530021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:21.732 [2024-11-29 09:30:44.530036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:21.732 [2024-11-29 09:30:44.530062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:21.732 [2024-11-29 09:30:44.530076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.250 NEW_FUNC[1/670]: 0x457e58 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:07:22.250 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:22.250 #12 NEW cov: 11628 ft: 11629 corp: 2/97b lim: 100 exec/s: 0 rss: 68Mb L: 96/96 MS: 5 CopyPart-ChangeBit-CrossOver-ChangeByte-InsertRepeatedBytes- 00:07:22.251 [2024-11-29 09:30:44.850644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:22.251 [2024-11-29 
09:30:44.850679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.251 [2024-11-29 09:30:44.850727] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:22.251 [2024-11-29 09:30:44.850743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.251 [2024-11-29 09:30:44.850772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:22.251 [2024-11-29 09:30:44.850786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.251 [2024-11-29 09:30:44.850813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:22.251 [2024-11-29 09:30:44.850827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.251 #13 NEW cov: 11741 ft: 12117 corp: 3/193b lim: 100 exec/s: 0 rss: 68Mb L: 96/96 MS: 1 CMP- DE: "\001\223\317\334\226\306?H"- 00:07:22.251 [2024-11-29 09:30:44.920728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:22.251 [2024-11-29 09:30:44.920767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.251 [2024-11-29 09:30:44.920812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:22.251 [2024-11-29 09:30:44.920827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.251 [2024-11-29 09:30:44.920855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:22.251 [2024-11-29 09:30:44.920870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.251 [2024-11-29 09:30:44.920897] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:22.251 [2024-11-29 09:30:44.920911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.251 #14 NEW cov: 11747 ft: 12435 corp: 4/289b lim: 100 exec/s: 0 rss: 68Mb L: 96/96 MS: 1 ChangeByte- 00:07:22.251 [2024-11-29 09:30:44.970779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:22.251 [2024-11-29 09:30:44.970806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.251 [2024-11-29 09:30:44.970856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:22.251 [2024-11-29 09:30:44.970872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.251 [2024-11-29 09:30:44.970900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:22.251 [2024-11-29 09:30:44.970914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.251 
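The NEW_FUNC lines above mark the two functions this run newly covered: TestOneInput, the libFuzzer entry point, and fuzz_nvm_write_zeroes_command at llvm_nvme_fuzz.c:562, the fuzzer_type-18 handler that turns fuzz bytes into WRITE ZEROES (opcode 08h) submissions. The skeleton below is only the generic shape of such a target, with made-up struct and field names, to show how raw input bytes become the lba/len values echoed in the qpair notices; it is not the SPDK implementation:

    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Illustrative only -- not the SPDK source. A libFuzzer target receives an
     * arbitrary byte buffer and must deterministically map it to one test case. */
    struct fake_nvm_cmd {
        uint8_t  opc;  /* 0x08 = WRITE ZEROES */
        uint64_t slba; /* starting LBA, taken straight from fuzz bytes */
        uint32_t nlb;  /* number of logical blocks */
    };

    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
    {
        struct fake_nvm_cmd cmd = { .opc = 0x08, .slba = 0, .nlb = 0 };

        if (size >= 12) {
            memcpy(&cmd.slba, data, 8);     /* fuzzer controls the LBA ... */
            memcpy(&cmd.nlb,  data + 8, 4); /* ... and the length field   */
        }
        /* A real target would submit cmd on an I/O qpair here and poll for the
         * completion that spdk_nvme_print_completion logs above. */
        (void)cmd;
        return 0;
    }
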
#15 NEW cov: 11832 ft: 13046 corp: 5/362b lim: 100 exec/s: 0 rss: 68Mb L: 73/96 MS: 1 EraseBytes- 00:07:22.251 [2024-11-29 09:30:45.041042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:22.251 [2024-11-29 09:30:45.041071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.251 [2024-11-29 09:30:45.041102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:22.251 [2024-11-29 09:30:45.041118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.251 [2024-11-29 09:30:45.041146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:22.251 [2024-11-29 09:30:45.041161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.251 [2024-11-29 09:30:45.041187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:22.251 [2024-11-29 09:30:45.041201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.251 #16 NEW cov: 11832 ft: 13132 corp: 6/459b lim: 100 exec/s: 0 rss: 68Mb L: 97/97 MS: 1 CrossOver- 00:07:22.251 [2024-11-29 09:30:45.091223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:22.251 [2024-11-29 09:30:45.091252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.251 [2024-11-29 09:30:45.091284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:22.251 [2024-11-29 09:30:45.091300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.251 [2024-11-29 09:30:45.091329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:22.251 [2024-11-29 09:30:45.091343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.251 [2024-11-29 09:30:45.091371] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:22.251 [2024-11-29 09:30:45.091385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.510 #17 NEW cov: 11832 ft: 13259 corp: 7/556b lim: 100 exec/s: 0 rss: 68Mb L: 97/97 MS: 1 InsertByte- 00:07:22.510 [2024-11-29 09:30:45.161341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:22.510 [2024-11-29 09:30:45.161369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.510 [2024-11-29 09:30:45.161415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:22.510 [2024-11-29 09:30:45.161430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.510 [2024-11-29 09:30:45.161459] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:22.510 [2024-11-29 09:30:45.161472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.510 [2024-11-29 09:30:45.161503] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:22.510 [2024-11-29 09:30:45.161517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.510 #18 NEW cov: 11832 ft: 13301 corp: 8/652b lim: 100 exec/s: 0 rss: 68Mb L: 96/97 MS: 1 CrossOver- 00:07:22.510 [2024-11-29 09:30:45.211434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:22.510 [2024-11-29 09:30:45.211462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.510 [2024-11-29 09:30:45.211492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:22.510 [2024-11-29 09:30:45.211523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.510 [2024-11-29 09:30:45.211551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:22.510 [2024-11-29 09:30:45.211565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.510 [2024-11-29 09:30:45.211592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:22.510 [2024-11-29 09:30:45.211614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.510 #19 NEW cov: 11832 ft: 13402 corp: 9/741b lim: 100 exec/s: 0 rss: 68Mb L: 89/97 MS: 1 InsertRepeatedBytes- 00:07:22.510 [2024-11-29 09:30:45.271518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:22.510 [2024-11-29 09:30:45.271545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.510 #22 NEW cov: 11832 ft: 13780 corp: 10/772b lim: 100 exec/s: 0 rss: 68Mb L: 31/97 MS: 3 CopyPart-ChangeByte-CrossOver- 00:07:22.510 [2024-11-29 09:30:45.331740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:22.510 [2024-11-29 09:30:45.331769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.510 [2024-11-29 09:30:45.331802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:22.510 [2024-11-29 09:30:45.331818] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.768 #23 NEW cov: 11832 ft: 14056 corp: 11/829b lim: 100 exec/s: 0 rss: 68Mb L: 57/97 MS: 1 EraseBytes- 00:07:22.768 [2024-11-29 09:30:45.391834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:22.769 [2024-11-29 09:30:45.391861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.769 NEW_FUNC[1/1]: 
0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:22.769 #24 NEW cov: 11849 ft: 14133 corp: 12/860b lim: 100 exec/s: 0 rss: 68Mb L: 31/97 MS: 1 ChangeBinInt- 00:07:22.769 [2024-11-29 09:30:45.462195] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:22.769 [2024-11-29 09:30:45.462223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.769 [2024-11-29 09:30:45.462270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:22.769 [2024-11-29 09:30:45.462286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.769 [2024-11-29 09:30:45.462315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:22.769 [2024-11-29 09:30:45.462329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.769 [2024-11-29 09:30:45.462361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:22.769 [2024-11-29 09:30:45.462376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.769 #25 NEW cov: 11849 ft: 14201 corp: 13/957b lim: 100 exec/s: 0 rss: 69Mb L: 97/97 MS: 1 ShuffleBytes- 00:07:22.769 [2024-11-29 09:30:45.532362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:22.769 [2024-11-29 09:30:45.532391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.769 [2024-11-29 09:30:45.532422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:22.769 [2024-11-29 09:30:45.532438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.769 [2024-11-29 09:30:45.532466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:22.769 [2024-11-29 09:30:45.532480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.769 [2024-11-29 09:30:45.532507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:22.769 [2024-11-29 09:30:45.532520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:22.769 #26 NEW cov: 11849 ft: 14211 corp: 14/1054b lim: 100 exec/s: 26 rss: 69Mb L: 97/97 MS: 1 ChangeByte- 00:07:22.769 [2024-11-29 09:30:45.602496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:22.769 [2024-11-29 09:30:45.602524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:22.769 [2024-11-29 09:30:45.602570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:22.769 [2024-11-29 09:30:45.602586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:22.769 [2024-11-29 09:30:45.602621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:22.769 [2024-11-29 09:30:45.602636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:22.769 [2024-11-29 09:30:45.602663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:22.769 [2024-11-29 09:30:45.602677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.028 #27 NEW cov: 11849 ft: 14267 corp: 15/1152b lim: 100 exec/s: 27 rss: 69Mb L: 98/98 MS: 1 InsertByte- 00:07:23.028 [2024-11-29 09:30:45.672723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.028 [2024-11-29 09:30:45.672762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.028 [2024-11-29 09:30:45.672806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.028 [2024-11-29 09:30:45.672822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.028 [2024-11-29 09:30:45.672850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:23.028 [2024-11-29 09:30:45.672864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.028 [2024-11-29 09:30:45.672891] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:23.028 [2024-11-29 09:30:45.672904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.028 #28 NEW cov: 11849 ft: 14293 corp: 16/1249b lim: 100 exec/s: 28 rss: 69Mb L: 97/98 MS: 1 ShuffleBytes- 00:07:23.028 [2024-11-29 09:30:45.722819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.028 [2024-11-29 09:30:45.722847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.028 [2024-11-29 09:30:45.722879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.028 [2024-11-29 09:30:45.722895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.028 [2024-11-29 09:30:45.722922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:23.028 [2024-11-29 09:30:45.722936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.028 [2024-11-29 09:30:45.722963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:23.028 [2024-11-29 09:30:45.722993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.028 #29 NEW cov: 11849 ft: 14338 corp: 17/1345b lim: 100 exec/s: 29 rss: 69Mb L: 96/98 MS: 1 ShuffleBytes- 00:07:23.028 [2024-11-29 
09:30:45.783647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.028 [2024-11-29 09:30:45.783673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.028 [2024-11-29 09:30:45.783726] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.028 [2024-11-29 09:30:45.783740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.028 [2024-11-29 09:30:45.783799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:23.028 [2024-11-29 09:30:45.783813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.028 [2024-11-29 09:30:45.783862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:23.028 [2024-11-29 09:30:45.783876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.028 #30 NEW cov: 11849 ft: 14497 corp: 18/1442b lim: 100 exec/s: 30 rss: 69Mb L: 97/98 MS: 1 CopyPart- 00:07:23.028 [2024-11-29 09:30:45.823647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.028 [2024-11-29 09:30:45.823673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.028 [2024-11-29 09:30:45.823726] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.028 [2024-11-29 09:30:45.823741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.028 [2024-11-29 09:30:45.823801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:23.028 [2024-11-29 09:30:45.823815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.028 #31 NEW cov: 11849 ft: 14550 corp: 19/1515b lim: 100 exec/s: 31 rss: 69Mb L: 73/98 MS: 1 CopyPart- 00:07:23.028 [2024-11-29 09:30:45.863853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.028 [2024-11-29 09:30:45.863878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.028 [2024-11-29 09:30:45.863919] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.028 [2024-11-29 09:30:45.863933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.028 [2024-11-29 09:30:45.863985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:23.028 [2024-11-29 09:30:45.863999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.028 [2024-11-29 09:30:45.864047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:23.028 [2024-11-29 09:30:45.864060] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.288 #32 NEW cov: 11849 ft: 14566 corp: 20/1612b lim: 100 exec/s: 32 rss: 69Mb L: 97/98 MS: 1 InsertByte- 00:07:23.288 [2024-11-29 09:30:45.903944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.288 [2024-11-29 09:30:45.903969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.288 [2024-11-29 09:30:45.904010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.288 [2024-11-29 09:30:45.904024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.288 [2024-11-29 09:30:45.904072] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:23.288 [2024-11-29 09:30:45.904086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.288 [2024-11-29 09:30:45.904133] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:23.288 [2024-11-29 09:30:45.904146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.288 #33 NEW cov: 11849 ft: 14579 corp: 21/1709b lim: 100 exec/s: 33 rss: 69Mb L: 97/98 MS: 1 PersAutoDict- DE: "\001\223\317\334\226\306?H"- 00:07:23.288 [2024-11-29 09:30:45.944116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.288 [2024-11-29 09:30:45.944142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.288 [2024-11-29 09:30:45.944186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.288 [2024-11-29 09:30:45.944199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.288 [2024-11-29 09:30:45.944246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:23.288 [2024-11-29 09:30:45.944260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.288 [2024-11-29 09:30:45.944307] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:23.288 [2024-11-29 09:30:45.944322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.288 #34 NEW cov: 11849 ft: 14592 corp: 22/1807b lim: 100 exec/s: 34 rss: 69Mb L: 98/98 MS: 1 CrossOver- 00:07:23.288 [2024-11-29 09:30:45.984224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.288 [2024-11-29 09:30:45.984249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.288 [2024-11-29 09:30:45.984293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.288 [2024-11-29 09:30:45.984307] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.288 [2024-11-29 09:30:45.984354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:23.288 [2024-11-29 09:30:45.984367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.288 [2024-11-29 09:30:45.984437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:23.288 [2024-11-29 09:30:45.984450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.288 #35 NEW cov: 11849 ft: 14619 corp: 23/1903b lim: 100 exec/s: 35 rss: 69Mb L: 96/98 MS: 1 ChangeBinInt- 00:07:23.288 [2024-11-29 09:30:46.024093] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.288 [2024-11-29 09:30:46.024117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.288 [2024-11-29 09:30:46.024168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.288 [2024-11-29 09:30:46.024181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.288 #36 NEW cov: 11849 ft: 14672 corp: 24/1960b lim: 100 exec/s: 36 rss: 69Mb L: 57/98 MS: 1 PersAutoDict- DE: "\001\223\317\334\226\306?H"- 00:07:23.288 [2024-11-29 09:30:46.064241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.288 [2024-11-29 09:30:46.064266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.288 [2024-11-29 09:30:46.064328] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.288 [2024-11-29 09:30:46.064340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.288 #37 NEW cov: 11849 ft: 14695 corp: 25/2017b lim: 100 exec/s: 37 rss: 69Mb L: 57/98 MS: 1 CrossOver- 00:07:23.288 [2024-11-29 09:30:46.104260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.288 [2024-11-29 09:30:46.104285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.288 #42 NEW cov: 11849 ft: 14710 corp: 26/2053b lim: 100 exec/s: 42 rss: 69Mb L: 36/98 MS: 5 ChangeByte-ChangeByte-InsertByte-ShuffleBytes-InsertRepeatedBytes- 00:07:23.548 [2024-11-29 09:30:46.144559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.548 [2024-11-29 09:30:46.144584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.548 [2024-11-29 09:30:46.144633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.548 [2024-11-29 09:30:46.144648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.548 [2024-11-29 09:30:46.144697] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:23.548 [2024-11-29 09:30:46.144710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.548 #43 NEW cov: 11849 ft: 14724 corp: 27/2126b lim: 100 exec/s: 43 rss: 69Mb L: 73/98 MS: 1 ChangeByte- 00:07:23.548 [2024-11-29 09:30:46.184481] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.548 [2024-11-29 09:30:46.184506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.548 #44 NEW cov: 11849 ft: 14740 corp: 28/2157b lim: 100 exec/s: 44 rss: 69Mb L: 31/98 MS: 1 ChangeBinInt- 00:07:23.548 [2024-11-29 09:30:46.224770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.548 [2024-11-29 09:30:46.224794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.548 [2024-11-29 09:30:46.224835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.548 [2024-11-29 09:30:46.224848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.548 [2024-11-29 09:30:46.224898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:23.548 [2024-11-29 09:30:46.224911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.548 #45 NEW cov: 11849 ft: 14769 corp: 29/2225b lim: 100 exec/s: 45 rss: 69Mb L: 68/98 MS: 1 EraseBytes- 00:07:23.548 [2024-11-29 09:30:46.264842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.548 [2024-11-29 09:30:46.264866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.548 [2024-11-29 09:30:46.264914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.548 [2024-11-29 09:30:46.264928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.548 #46 NEW cov: 11849 ft: 14773 corp: 30/2283b lim: 100 exec/s: 46 rss: 69Mb L: 58/98 MS: 1 InsertByte- 00:07:23.548 [2024-11-29 09:30:46.305006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.548 [2024-11-29 09:30:46.305032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.548 [2024-11-29 09:30:46.305068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.548 [2024-11-29 09:30:46.305082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.548 [2024-11-29 09:30:46.305132] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:23.548 [2024-11-29 09:30:46.305146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.548 #47 NEW cov: 11849 ft: 14787 
corp: 31/2352b lim: 100 exec/s: 47 rss: 69Mb L: 69/98 MS: 1 InsertRepeatedBytes- 00:07:23.548 [2024-11-29 09:30:46.344912] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.548 [2024-11-29 09:30:46.344937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.548 #48 NEW cov: 11849 ft: 14814 corp: 32/2388b lim: 100 exec/s: 48 rss: 69Mb L: 36/98 MS: 1 ShuffleBytes- 00:07:23.548 [2024-11-29 09:30:46.385513] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.548 [2024-11-29 09:30:46.385539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.548 [2024-11-29 09:30:46.385587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.548 [2024-11-29 09:30:46.385605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.548 [2024-11-29 09:30:46.385656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:23.548 [2024-11-29 09:30:46.385670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.548 [2024-11-29 09:30:46.385717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:23.548 [2024-11-29 09:30:46.385731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.548 [2024-11-29 09:30:46.385779] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:07:23.548 [2024-11-29 09:30:46.385793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:23.808 #49 NEW cov: 11849 ft: 14844 corp: 33/2488b lim: 100 exec/s: 49 rss: 69Mb L: 100/100 MS: 1 CopyPart- 00:07:23.808 [2024-11-29 09:30:46.425453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.808 [2024-11-29 09:30:46.425477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.808 [2024-11-29 09:30:46.425522] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.808 [2024-11-29 09:30:46.425536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.808 [2024-11-29 09:30:46.425585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:23.808 [2024-11-29 09:30:46.425602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.808 [2024-11-29 09:30:46.425668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:23.808 [2024-11-29 09:30:46.425682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.808 #50 NEW cov: 11856 ft: 14854 corp: 34/2585b lim: 100 exec/s: 50 rss: 69Mb L: 97/100 MS: 1 
CopyPart- 00:07:23.808 [2024-11-29 09:30:46.465596] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.808 [2024-11-29 09:30:46.465626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.808 [2024-11-29 09:30:46.465690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.808 [2024-11-29 09:30:46.465703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.808 [2024-11-29 09:30:46.465754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:07:23.808 [2024-11-29 09:30:46.465767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:23.808 [2024-11-29 09:30:46.465816] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:07:23.808 [2024-11-29 09:30:46.465830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:23.808 #51 NEW cov: 11856 ft: 14863 corp: 35/2681b lim: 100 exec/s: 51 rss: 70Mb L: 96/100 MS: 1 ShuffleBytes- 00:07:23.808 [2024-11-29 09:30:46.505469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:07:23.808 [2024-11-29 09:30:46.505494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:23.808 [2024-11-29 09:30:46.505528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:07:23.808 [2024-11-29 09:30:46.505541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:23.808 #52 NEW cov: 11856 ft: 14866 corp: 36/2738b lim: 100 exec/s: 26 rss: 70Mb L: 57/100 MS: 1 ShuffleBytes- 00:07:23.808 #52 DONE cov: 11856 ft: 14866 corp: 36/2738b lim: 100 exec/s: 26 rss: 70Mb 00:07:23.808 ###### Recommended dictionary. ###### 00:07:23.808 "\001\223\317\334\226\306?H" # Uses: 2 00:07:23.808 ###### End of recommended dictionary. 
###### 00:07:23.808 Done 52 runs in 2 second(s) 00:07:23.808 09:30:46 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf 00:07:23.808 09:30:46 -- ../common.sh@72 -- # (( i++ )) 00:07:23.808 09:30:46 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:23.808 09:30:46 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:07:23.808 09:30:46 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:07:23.808 09:30:46 -- nvmf/run.sh@24 -- # local timen=1 00:07:23.808 09:30:46 -- nvmf/run.sh@25 -- # local core=0x1 00:07:23.808 09:30:46 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:23.808 09:30:46 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:07:24.068 09:30:46 -- nvmf/run.sh@29 -- # printf %02d 19 00:07:24.068 09:30:46 -- nvmf/run.sh@29 -- # port=4419 00:07:24.068 09:30:46 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:24.068 09:30:46 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:07:24.068 09:30:46 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:24.068 09:30:46 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock 00:07:24.068 [2024-11-29 09:30:46.688522] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:24.068 [2024-11-29 09:30:46.688594] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3187428 ] 00:07:24.068 EAL: No free 2048 kB hugepages reported on node 1 00:07:24.327 [2024-11-29 09:30:46.939737] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:24.327 [2024-11-29 09:30:47.025214] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:24.327 [2024-11-29 09:30:47.025357] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.327 [2024-11-29 09:30:47.083383] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:24.327 [2024-11-29 09:30:47.099622] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:07:24.327 INFO: Running with entropic power schedule (0xFF, 100). 00:07:24.327 INFO: Seed: 2613865249 00:07:24.327 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:24.327 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:24.327 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:07:24.327 INFO: A corpus is not provided, starting from an empty corpus 00:07:24.328 #2 INITED exec/s: 0 rss: 60Mb 00:07:24.328 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:24.328 This may also happen if the target rejected all inputs we tried so far 00:07:24.328 [2024-11-29 09:30:47.154374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:07:24.328 [2024-11-29 09:30:47.154408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.328 [2024-11-29 09:30:47.154456] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:07:24.328 [2024-11-29 09:30:47.154474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.328 [2024-11-29 09:30:47.154504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:24.328 [2024-11-29 09:30:47.154520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.847 NEW_FUNC[1/670]: 0x45ae18 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:07:24.847 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:24.847 #14 NEW cov: 11606 ft: 11607 corp: 2/34b lim: 50 exec/s: 0 rss: 68Mb L: 33/33 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:24.847 [2024-11-29 09:30:47.475013] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:791617290 len:1 00:07:24.847 [2024-11-29 09:30:47.475050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.847 #19 NEW cov: 11719 ft: 12349 corp: 3/46b lim: 50 exec/s: 0 rss: 69Mb L: 12/33 MS: 5 ChangeByte-ChangeBit-InsertByte-CopyPart-CMP- DE: "\012\000\000\000\000\000\000\000"- 00:07:24.847 [2024-11-29 09:30:47.525190] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:281471473360650 len:65536 00:07:24.847 [2024-11-29 09:30:47.525220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.847 [2024-11-29 09:30:47.525266] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:24.847 [2024-11-29 09:30:47.525283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.847 [2024-11-29 09:30:47.525312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:24.847 [2024-11-29 09:30:47.525328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.847 [2024-11-29 09:30:47.525355] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446462603027808255 len:1 00:07:24.847 [2024-11-29 09:30:47.525370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.847 #20 NEW cov: 11725 ft: 12902 corp: 4/88b lim: 50 exec/s: 0 rss: 69Mb L: 42/42 MS: 1 
InsertRepeatedBytes- 00:07:24.847 [2024-11-29 09:30:47.585349] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076655 len:65536 00:07:24.847 [2024-11-29 09:30:47.585379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.847 [2024-11-29 09:30:47.585411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:24.847 [2024-11-29 09:30:47.585429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.847 [2024-11-29 09:30:47.585458] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:24.847 [2024-11-29 09:30:47.585474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.847 [2024-11-29 09:30:47.585501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:170 00:07:24.847 [2024-11-29 09:30:47.585516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.847 #24 NEW cov: 11810 ft: 13235 corp: 5/134b lim: 50 exec/s: 0 rss: 69Mb L: 46/46 MS: 4 CrossOver-CrossOver-CMP-CrossOver- DE: "}\251\303-\336\317\223\000"- 00:07:24.847 [2024-11-29 09:30:47.635408] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14612714914167442122 len:51915 00:07:24.847 [2024-11-29 09:30:47.635438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.847 [2024-11-29 09:30:47.635483] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14612714913291487946 len:51915 00:07:24.847 [2024-11-29 09:30:47.635500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.847 [2024-11-29 09:30:47.635528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14612714913291487946 len:51915 00:07:24.847 [2024-11-29 09:30:47.635544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:24.847 [2024-11-29 09:30:47.635571] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14612714913291487946 len:51915 00:07:24.847 [2024-11-29 09:30:47.635591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:24.847 #29 NEW cov: 11810 ft: 13394 corp: 6/180b lim: 50 exec/s: 0 rss: 69Mb L: 46/46 MS: 5 CrossOver-ChangeBinInt-EraseBytes-ChangeByte-InsertRepeatedBytes- 00:07:24.847 [2024-11-29 09:30:47.685566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:07:24.847 [2024-11-29 09:30:47.685596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:24.847 [2024-11-29 09:30:47.685652] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:62976 len:1 00:07:24.847 [2024-11-29 09:30:47.685670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:24.847 [2024-11-29 09:30:47.685699] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:24.847 [2024-11-29 09:30:47.685716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.107 #30 NEW cov: 11810 ft: 13476 corp: 7/213b lim: 50 exec/s: 0 rss: 69Mb L: 33/46 MS: 1 ChangeBinInt- 00:07:25.107 [2024-11-29 09:30:47.755806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14612714914167442122 len:51915 00:07:25.107 [2024-11-29 09:30:47.755837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.107 [2024-11-29 09:30:47.755868] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14612714913291487946 len:51915 00:07:25.107 [2024-11-29 09:30:47.755885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.107 [2024-11-29 09:30:47.755914] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14612714913291487946 len:51915 00:07:25.107 [2024-11-29 09:30:47.755930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.107 [2024-11-29 09:30:47.755957] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14612714910707796682 len:51915 00:07:25.107 [2024-11-29 09:30:47.755989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.107 #31 NEW cov: 11810 ft: 13556 corp: 8/259b lim: 50 exec/s: 0 rss: 69Mb L: 46/46 MS: 1 ChangeByte- 00:07:25.107 [2024-11-29 09:30:47.825916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:07:25.107 [2024-11-29 09:30:47.825946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.107 [2024-11-29 09:30:47.825991] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:07:25.107 [2024-11-29 09:30:47.826009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.107 [2024-11-29 09:30:47.826038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:129 00:07:25.107 [2024-11-29 09:30:47.826054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.107 #32 NEW cov: 11810 ft: 13587 corp: 9/292b lim: 50 exec/s: 0 rss: 69Mb L: 33/46 MS: 1 ChangeBit- 00:07:25.107 [2024-11-29 09:30:47.876047] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076655 len:65536 00:07:25.107 [2024-11-29 09:30:47.876077] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.107 [2024-11-29 09:30:47.876127] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:25.107 [2024-11-29 09:30:47.876145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.107 [2024-11-29 09:30:47.876173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:25.107 [2024-11-29 09:30:47.876189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.107 [2024-11-29 09:30:47.876216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446743021442564095 len:65281 00:07:25.107 [2024-11-29 09:30:47.876232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.107 #33 NEW cov: 11810 ft: 13611 corp: 10/339b lim: 50 exec/s: 0 rss: 69Mb L: 47/47 MS: 1 CrossOver- 00:07:25.107 [2024-11-29 09:30:47.936238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076655 len:65536 00:07:25.107 [2024-11-29 09:30:47.936267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.107 [2024-11-29 09:30:47.936313] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:25.107 [2024-11-29 09:30:47.936330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.107 [2024-11-29 09:30:47.936358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18387915807871991807 len:256 00:07:25.107 [2024-11-29 09:30:47.936374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.107 [2024-11-29 09:30:47.936401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446743021442564095 len:65281 00:07:25.107 [2024-11-29 09:30:47.936417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.366 #34 NEW cov: 11810 ft: 13642 corp: 11/386b lim: 50 exec/s: 0 rss: 69Mb L: 47/47 MS: 1 ChangeBinInt- 00:07:25.367 [2024-11-29 09:30:47.996393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076655 len:65536 00:07:25.367 [2024-11-29 09:30:47.996422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.367 [2024-11-29 09:30:47.996467] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:25.367 [2024-11-29 09:30:47.996484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.367 [2024-11-29 09:30:47.996513] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:25.367 [2024-11-29 09:30:47.996529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.367 [2024-11-29 09:30:47.996556] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:170 00:07:25.367 [2024-11-29 09:30:47.996571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.367 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:25.367 #35 NEW cov: 11833 ft: 13723 corp: 12/432b lim: 50 exec/s: 0 rss: 69Mb L: 46/47 MS: 1 CopyPart- 00:07:25.367 [2024-11-29 09:30:48.046499] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076655 len:65536 00:07:25.367 [2024-11-29 09:30:48.046531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.367 [2024-11-29 09:30:48.046578] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:25.367 [2024-11-29 09:30:48.046595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.367 [2024-11-29 09:30:48.046634] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18387915807871991807 len:256 00:07:25.367 [2024-11-29 09:30:48.046650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.367 [2024-11-29 09:30:48.046676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446743021442564095 len:65281 00:07:25.367 [2024-11-29 09:30:48.046692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.367 #36 NEW cov: 11833 ft: 13777 corp: 13/479b lim: 50 exec/s: 0 rss: 69Mb L: 47/47 MS: 1 ChangeByte- 00:07:25.367 [2024-11-29 09:30:48.116716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076655 len:65536 00:07:25.367 [2024-11-29 09:30:48.116745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.367 [2024-11-29 09:30:48.116790] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:25.367 [2024-11-29 09:30:48.116807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.367 [2024-11-29 09:30:48.116836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:25.367 [2024-11-29 09:30:48.116852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.367 [2024-11-29 09:30:48.116878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:2816 00:07:25.367 [2024-11-29 09:30:48.116894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.367 #37 NEW cov: 11833 ft: 13843 corp: 14/527b lim: 50 exec/s: 37 rss: 69Mb L: 48/48 MS: 1 CopyPart- 00:07:25.367 [2024-11-29 09:30:48.166786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:07:25.367 [2024-11-29 09:30:48.166815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.367 [2024-11-29 09:30:48.166861] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744069414649087 len:65536 00:07:25.367 [2024-11-29 09:30:48.166878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.367 [2024-11-29 09:30:48.166907] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:25.367 [2024-11-29 09:30:48.166923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.367 #38 NEW cov: 11833 ft: 13870 corp: 15/560b lim: 50 exec/s: 38 rss: 69Mb L: 33/48 MS: 1 ChangeBinInt- 00:07:25.626 [2024-11-29 09:30:48.216989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076654 len:65536 00:07:25.626 [2024-11-29 09:30:48.217019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.626 [2024-11-29 09:30:48.217070] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:25.626 [2024-11-29 09:30:48.217088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.626 [2024-11-29 09:30:48.217116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18387915807871991807 len:256 00:07:25.626 [2024-11-29 09:30:48.217131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.626 [2024-11-29 09:30:48.217158] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446743021442564095 len:65281 00:07:25.626 [2024-11-29 09:30:48.217175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.627 #39 NEW cov: 11833 ft: 13930 corp: 16/607b lim: 50 exec/s: 39 rss: 69Mb L: 47/48 MS: 1 ChangeBit- 00:07:25.627 [2024-11-29 09:30:48.267142] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:281471473360650 len:65536 00:07:25.627 [2024-11-29 09:30:48.267172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.627 [2024-11-29 09:30:48.267219] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:25.627 [2024-11-29 09:30:48.267237] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.627 [2024-11-29 09:30:48.267266] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:25.627 [2024-11-29 09:30:48.267283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.627 [2024-11-29 09:30:48.267310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446685571960012799 len:51915 00:07:25.627 [2024-11-29 09:30:48.267326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.627 #40 NEW cov: 11833 ft: 13952 corp: 17/656b lim: 50 exec/s: 40 rss: 69Mb L: 49/49 MS: 1 CrossOver- 00:07:25.627 [2024-11-29 09:30:48.327255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1025 00:07:25.627 [2024-11-29 09:30:48.327284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.627 [2024-11-29 09:30:48.327330] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744069414649087 len:65536 00:07:25.627 [2024-11-29 09:30:48.327347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.627 [2024-11-29 09:30:48.327376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:07:25.627 [2024-11-29 09:30:48.327392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.627 #41 NEW cov: 11833 ft: 13993 corp: 18/689b lim: 50 exec/s: 41 rss: 70Mb L: 33/49 MS: 1 ChangeBit- 00:07:25.627 [2024-11-29 09:30:48.387352] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3395443640670125359 len:256 00:07:25.627 [2024-11-29 09:30:48.387382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.627 [2024-11-29 09:30:48.387432] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:25.627 [2024-11-29 09:30:48.387450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.627 #44 NEW cov: 11833 ft: 14244 corp: 19/710b lim: 50 exec/s: 44 rss: 70Mb L: 21/49 MS: 3 CrossOver-ChangeByte-CrossOver- 00:07:25.627 [2024-11-29 09:30:48.437569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076655 len:65536 00:07:25.627 [2024-11-29 09:30:48.437609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.627 [2024-11-29 09:30:48.437642] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:25.627 [2024-11-29 09:30:48.437659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.627 [2024-11-29 09:30:48.437687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18387915807871991807 len:256 00:07:25.627 [2024-11-29 09:30:48.437703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.627 [2024-11-29 09:30:48.437730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446743017214705664 len:65281 00:07:25.627 [2024-11-29 09:30:48.437745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.886 #45 NEW cov: 11833 ft: 14249 corp: 20/757b lim: 50 exec/s: 45 rss: 70Mb L: 47/49 MS: 1 ChangeBinInt- 00:07:25.886 [2024-11-29 09:30:48.487664] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076655 len:65536 00:07:25.886 [2024-11-29 09:30:48.487694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.886 [2024-11-29 09:30:48.487741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:25.886 [2024-11-29 09:30:48.487758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.886 [2024-11-29 09:30:48.487786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18387915807871991807 len:256 00:07:25.886 [2024-11-29 09:30:48.487802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.886 [2024-11-29 09:30:48.487829] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446743017214705664 len:65281 00:07:25.886 [2024-11-29 09:30:48.487844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.886 #46 NEW cov: 11833 ft: 14277 corp: 21/804b lim: 50 exec/s: 46 rss: 70Mb L: 47/49 MS: 1 ShuffleBytes- 00:07:25.886 [2024-11-29 09:30:48.547865] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076655 len:65536 00:07:25.886 [2024-11-29 09:30:48.547894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.886 [2024-11-29 09:30:48.547940] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:25.886 [2024-11-29 09:30:48.547957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.886 [2024-11-29 09:30:48.547986] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18387915807871991807 len:256 00:07:25.886 [2024-11-29 09:30:48.548002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.886 [2024-11-29 09:30:48.548029] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE 
sqid:1 cid:3 nsid:0 lba:18446743021442564095 len:65281 00:07:25.886 [2024-11-29 09:30:48.548048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.886 #47 NEW cov: 11833 ft: 14284 corp: 22/853b lim: 50 exec/s: 47 rss: 70Mb L: 49/49 MS: 1 CopyPart- 00:07:25.886 [2024-11-29 09:30:48.598011] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076655 len:65536 00:07:25.886 [2024-11-29 09:30:48.598042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.886 [2024-11-29 09:30:48.598074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:25.886 [2024-11-29 09:30:48.598092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.886 [2024-11-29 09:30:48.598121] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:25.886 [2024-11-29 09:30:48.598138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.886 [2024-11-29 09:30:48.598165] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18377781609198780415 len:43460 00:07:25.886 [2024-11-29 09:30:48.598181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.886 #48 NEW cov: 11833 ft: 14305 corp: 23/898b lim: 50 exec/s: 48 rss: 70Mb L: 45/49 MS: 1 EraseBytes- 00:07:25.886 [2024-11-29 09:30:48.658138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076655 len:65536 00:07:25.886 [2024-11-29 09:30:48.658168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:25.886 [2024-11-29 09:30:48.658213] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:25.886 [2024-11-29 09:30:48.658230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:25.886 [2024-11-29 09:30:48.658258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:25.886 [2024-11-29 09:30:48.658274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:25.886 [2024-11-29 09:30:48.658301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446743021442564095 len:65281 00:07:25.886 [2024-11-29 09:30:48.658316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:25.886 #49 NEW cov: 11833 ft: 14318 corp: 24/945b lim: 50 exec/s: 49 rss: 70Mb L: 47/49 MS: 1 CrossOver- 00:07:25.886 [2024-11-29 09:30:48.708113] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076654 len:1 00:07:25.886 
[2024-11-29 09:30:48.708142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.145 #50 NEW cov: 11833 ft: 14348 corp: 25/960b lim: 50 exec/s: 50 rss: 70Mb L: 15/49 MS: 1 CrossOver- 00:07:26.145 [2024-11-29 09:30:48.779147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:281471473360650 len:65536 00:07:26.145 [2024-11-29 09:30:48.779176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.145 [2024-11-29 09:30:48.779211] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:26.145 [2024-11-29 09:30:48.779227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.145 [2024-11-29 09:30:48.779281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551612 len:65536 00:07:26.145 [2024-11-29 09:30:48.779296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.145 [2024-11-29 09:30:48.779345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446685571960012799 len:51915 00:07:26.145 [2024-11-29 09:30:48.779360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.145 #51 NEW cov: 11833 ft: 14510 corp: 26/1009b lim: 50 exec/s: 51 rss: 70Mb L: 49/49 MS: 1 ChangeBinInt- 00:07:26.145 [2024-11-29 09:30:48.819255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14612714914167442122 len:51915 00:07:26.145 [2024-11-29 09:30:48.819283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.145 [2024-11-29 09:30:48.819316] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14612714913291487946 len:51915 00:07:26.145 [2024-11-29 09:30:48.819331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.145 [2024-11-29 09:30:48.819381] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14612714913291487946 len:51915 00:07:26.145 [2024-11-29 09:30:48.819397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.145 [2024-11-29 09:30:48.819448] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14612714913291487946 len:51915 00:07:26.145 [2024-11-29 09:30:48.819463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.145 #52 NEW cov: 11833 ft: 14553 corp: 27/1055b lim: 50 exec/s: 52 rss: 70Mb L: 46/49 MS: 1 CrossOver- 00:07:26.145 [2024-11-29 09:30:48.859351] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076655 len:65536 00:07:26.146 [2024-11-29 09:30:48.859379] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.146 [2024-11-29 09:30:48.859419] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:26.146 [2024-11-29 09:30:48.859434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.146 [2024-11-29 09:30:48.859483] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:26.146 [2024-11-29 09:30:48.859497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.146 [2024-11-29 09:30:48.859567] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:792352789584936959 len:49966 00:07:26.146 [2024-11-29 09:30:48.859582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.146 #53 NEW cov: 11833 ft: 14620 corp: 28/1099b lim: 50 exec/s: 53 rss: 70Mb L: 44/49 MS: 1 EraseBytes- 00:07:26.146 [2024-11-29 09:30:48.899510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14612714914167442122 len:51915 00:07:26.146 [2024-11-29 09:30:48.899536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.146 [2024-11-29 09:30:48.899573] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:14612714913291487946 len:51915 00:07:26.146 [2024-11-29 09:30:48.899591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.146 [2024-11-29 09:30:48.899646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:14612714913291487946 len:51915 00:07:26.146 [2024-11-29 09:30:48.899661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.146 [2024-11-29 09:30:48.899710] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:14612714913291487946 len:51915 00:07:26.146 [2024-11-29 09:30:48.899725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.146 #54 NEW cov: 11833 ft: 14677 corp: 29/1145b lim: 50 exec/s: 54 rss: 70Mb L: 46/49 MS: 1 ChangeBinInt- 00:07:26.146 [2024-11-29 09:30:48.939585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076655 len:65536 00:07:26.146 [2024-11-29 09:30:48.939617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.146 [2024-11-29 09:30:48.939666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:26.146 [2024-11-29 09:30:48.939681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.146 [2024-11-29 09:30:48.939732] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:26.146 [2024-11-29 09:30:48.939748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.146 [2024-11-29 09:30:48.939798] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18377781609198780415 len:43460 00:07:26.146 [2024-11-29 09:30:48.939813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.146 #55 NEW cov: 11833 ft: 14725 corp: 30/1190b lim: 50 exec/s: 55 rss: 70Mb L: 45/49 MS: 1 ChangeByte- 00:07:26.146 [2024-11-29 09:30:48.979737] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076655 len:65536 00:07:26.146 [2024-11-29 09:30:48.979765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.146 [2024-11-29 09:30:48.979798] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:26.146 [2024-11-29 09:30:48.979813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.146 [2024-11-29 09:30:48.979862] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:16212728864898547711 len:1 00:07:26.146 [2024-11-29 09:30:48.979877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.146 [2024-11-29 09:30:48.979927] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:72057594021412864 len:2816 00:07:26.146 [2024-11-29 09:30:48.979943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.405 #56 NEW cov: 11833 ft: 14744 corp: 31/1238b lim: 50 exec/s: 56 rss: 70Mb L: 48/49 MS: 1 InsertByte- 00:07:26.405 [2024-11-29 09:30:49.019856] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076655 len:65536 00:07:26.405 [2024-11-29 09:30:49.019883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.405 [2024-11-29 09:30:49.019926] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:26.405 [2024-11-29 09:30:49.019941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.405 [2024-11-29 09:30:49.019991] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446462603027808255 len:256 00:07:26.405 [2024-11-29 09:30:49.020006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.405 [2024-11-29 09:30:49.020055] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446743021442564095 len:65281 00:07:26.405 [2024-11-29 09:30:49.020069] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.405 #57 NEW cov: 11833 ft: 14806 corp: 32/1285b lim: 50 exec/s: 57 rss: 70Mb L: 47/49 MS: 1 ChangeByte- 00:07:26.405 [2024-11-29 09:30:49.059853] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:69805794224242688 len:1 00:07:26.405 [2024-11-29 09:30:49.059880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.405 [2024-11-29 09:30:49.059915] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:07:26.405 [2024-11-29 09:30:49.059930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.405 [2024-11-29 09:30:49.059982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:129 00:07:26.405 [2024-11-29 09:30:49.059997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.405 #58 NEW cov: 11833 ft: 14858 corp: 33/1318b lim: 50 exec/s: 58 rss: 70Mb L: 33/49 MS: 1 ChangeBinInt- 00:07:26.405 [2024-11-29 09:30:49.100103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:3395443639187963183 len:256 00:07:26.405 [2024-11-29 09:30:49.100128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.405 [2024-11-29 09:30:49.100188] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:26.405 [2024-11-29 09:30:49.100204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.405 [2024-11-29 09:30:49.100254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:07:26.405 [2024-11-29 09:30:49.100269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.405 [2024-11-29 09:30:49.100320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65291 00:07:26.405 [2024-11-29 09:30:49.100333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.405 #59 NEW cov: 11833 ft: 14865 corp: 34/1367b lim: 50 exec/s: 59 rss: 70Mb L: 49/49 MS: 1 InsertByte- 00:07:26.405 [2024-11-29 09:30:49.140168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:2236600164951076655 len:65536 00:07:26.405 [2024-11-29 09:30:49.140194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:26.405 [2024-11-29 09:30:49.140235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:07:26.405 [2024-11-29 09:30:49.140250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:26.405 [2024-11-29 09:30:49.140303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073695592447 len:65536 00:07:26.405 [2024-11-29 09:30:49.140318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:26.405 [2024-11-29 09:30:49.140387] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18377781609198780415 len:43460 00:07:26.405 [2024-11-29 09:30:49.140403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:26.405 #60 NEW cov: 11833 ft: 14870 corp: 35/1412b lim: 50 exec/s: 30 rss: 70Mb L: 45/49 MS: 1 ChangeByte- 00:07:26.405 #60 DONE cov: 11833 ft: 14870 corp: 35/1412b lim: 50 exec/s: 30 rss: 70Mb 00:07:26.405 ###### Recommended dictionary. ###### 00:07:26.405 "\012\000\000\000\000\000\000\000" # Uses: 0 00:07:26.405 "}\251\303-\336\317\223\000" # Uses: 0 00:07:26.405 ###### End of recommended dictionary. ###### 00:07:26.405 Done 60 runs in 2 second(s) 00:07:26.664 09:30:49 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf 00:07:26.664 09:30:49 -- ../common.sh@72 -- # (( i++ )) 00:07:26.664 09:30:49 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:26.664 09:30:49 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:07:26.664 09:30:49 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:07:26.664 09:30:49 -- nvmf/run.sh@24 -- # local timen=1 00:07:26.664 09:30:49 -- nvmf/run.sh@25 -- # local core=0x1 00:07:26.664 09:30:49 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:26.664 09:30:49 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:07:26.664 09:30:49 -- nvmf/run.sh@29 -- # printf %02d 20 00:07:26.664 09:30:49 -- nvmf/run.sh@29 -- # port=4420 00:07:26.664 09:30:49 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:26.664 09:30:49 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:07:26.664 09:30:49 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:26.664 09:30:49 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock 00:07:26.664 [2024-11-29 09:30:49.323894] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:26.664 [2024-11-29 09:30:49.323973] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3187972 ] 00:07:26.664 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.924 [2024-11-29 09:30:49.573563] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:26.924 [2024-11-29 09:30:49.664418] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:26.924 [2024-11-29 09:30:49.664560] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.924 [2024-11-29 09:30:49.722294] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:26.924 [2024-11-29 09:30:49.738668] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:07:26.924 INFO: Running with entropic power schedule (0xFF, 100). 00:07:26.924 INFO: Seed: 956895284 00:07:27.184 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:27.184 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:27.184 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:07:27.184 INFO: A corpus is not provided, starting from an empty corpus 00:07:27.184 #2 INITED exec/s: 0 rss: 60Mb 00:07:27.184 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:27.184 This may also happen if the target rejected all inputs we tried so far 00:07:27.184 [2024-11-29 09:30:49.783808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.184 [2024-11-29 09:30:49.783837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.184 [2024-11-29 09:30:49.783893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:27.184 [2024-11-29 09:30:49.783907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.443 NEW_FUNC[1/672]: 0x45c9d8 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:07:27.443 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:27.443 #15 NEW cov: 11664 ft: 11665 corp: 2/51b lim: 90 exec/s: 0 rss: 68Mb L: 50/50 MS: 3 InsertByte-ChangeByte-InsertRepeatedBytes- 00:07:27.443 [2024-11-29 09:30:50.114624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.443 [2024-11-29 09:30:50.114670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.443 #20 NEW cov: 11777 ft: 12994 corp: 3/84b lim: 90 exec/s: 0 rss: 69Mb L: 33/50 MS: 5 ShuffleBytes-ChangeByte-CopyPart-ChangeBit-CrossOver- 00:07:27.443 [2024-11-29 09:30:50.154758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.443 [2024-11-29 09:30:50.154788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
00:07:27.444 [2024-11-29 09:30:50.154848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:27.444 [2024-11-29 09:30:50.154862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.444 #21 NEW cov: 11783 ft: 13192 corp: 4/134b lim: 90 exec/s: 0 rss: 69Mb L: 50/50 MS: 1 ChangeByte- 00:07:27.444 [2024-11-29 09:30:50.194733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.444 [2024-11-29 09:30:50.194760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.444 #22 NEW cov: 11868 ft: 13503 corp: 5/167b lim: 90 exec/s: 0 rss: 69Mb L: 33/50 MS: 1 ChangeBit- 00:07:27.444 [2024-11-29 09:30:50.244880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.444 [2024-11-29 09:30:50.244909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.444 #25 NEW cov: 11868 ft: 13606 corp: 6/186b lim: 90 exec/s: 0 rss: 69Mb L: 19/50 MS: 3 CrossOver-ChangeBinInt-InsertRepeatedBytes- 00:07:27.444 [2024-11-29 09:30:50.284998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.444 [2024-11-29 09:30:50.285025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.703 #26 NEW cov: 11868 ft: 13840 corp: 7/219b lim: 90 exec/s: 0 rss: 69Mb L: 33/50 MS: 1 ChangeBit- 00:07:27.703 [2024-11-29 09:30:50.325106] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.703 [2024-11-29 09:30:50.325134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.703 #27 NEW cov: 11868 ft: 13893 corp: 8/252b lim: 90 exec/s: 0 rss: 69Mb L: 33/50 MS: 1 CopyPart- 00:07:27.703 [2024-11-29 09:30:50.365230] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.703 [2024-11-29 09:30:50.365257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.703 #28 NEW cov: 11868 ft: 13924 corp: 9/276b lim: 90 exec/s: 0 rss: 69Mb L: 24/50 MS: 1 EraseBytes- 00:07:27.703 [2024-11-29 09:30:50.415352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.703 [2024-11-29 09:30:50.415378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.703 #29 NEW cov: 11868 ft: 14020 corp: 10/300b lim: 90 exec/s: 0 rss: 69Mb L: 24/50 MS: 1 CopyPart- 00:07:27.703 [2024-11-29 09:30:50.455483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.703 [2024-11-29 09:30:50.455510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.703 #35 NEW cov: 11868 ft: 14070 corp: 11/325b lim: 90 exec/s: 0 rss: 69Mb L: 25/50 MS: 1 InsertByte- 00:07:27.703 [2024-11-29 09:30:50.495574] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.703 [2024-11-29 09:30:50.495604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.703 #36 NEW cov: 11868 ft: 14125 corp: 12/344b lim: 90 exec/s: 0 rss: 69Mb L: 19/50 MS: 1 ChangeBinInt- 00:07:27.963 [2024-11-29 09:30:50.546256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.963 [2024-11-29 09:30:50.546283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.963 [2024-11-29 09:30:50.546330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:27.963 [2024-11-29 09:30:50.546346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.963 [2024-11-29 09:30:50.546402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:27.963 [2024-11-29 09:30:50.546417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:27.963 [2024-11-29 09:30:50.546477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:27.963 [2024-11-29 09:30:50.546492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:27.963 #42 NEW cov: 11868 ft: 14547 corp: 13/431b lim: 90 exec/s: 0 rss: 69Mb L: 87/87 MS: 1 InsertRepeatedBytes- 00:07:27.963 [2024-11-29 09:30:50.595907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.963 [2024-11-29 09:30:50.595935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.963 #43 NEW cov: 11868 ft: 14610 corp: 14/460b lim: 90 exec/s: 0 rss: 70Mb L: 29/87 MS: 1 EraseBytes- 00:07:27.963 [2024-11-29 09:30:50.646204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.963 [2024-11-29 09:30:50.646232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.963 [2024-11-29 09:30:50.646289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:27.963 [2024-11-29 09:30:50.646305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:27.963 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:27.963 #44 NEW cov: 11891 ft: 14626 corp: 15/507b lim: 90 exec/s: 0 rss: 70Mb L: 47/87 MS: 1 CopyPart- 00:07:27.963 [2024-11-29 09:30:50.696179] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.963 [2024-11-29 09:30:50.696207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.963 #45 NEW cov: 11891 ft: 14660 corp: 16/531b lim: 90 exec/s: 0 rss: 70Mb L: 24/87 MS: 1 ChangeByte- 00:07:27.963 [2024-11-29 
09:30:50.746324] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.963 [2024-11-29 09:30:50.746351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:27.963 #46 NEW cov: 11891 ft: 14701 corp: 17/565b lim: 90 exec/s: 0 rss: 70Mb L: 34/87 MS: 1 InsertByte- 00:07:27.963 [2024-11-29 09:30:50.786478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:27.963 [2024-11-29 09:30:50.786505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.221 #47 NEW cov: 11891 ft: 14712 corp: 18/598b lim: 90 exec/s: 47 rss: 70Mb L: 33/87 MS: 1 CrossOver- 00:07:28.222 [2024-11-29 09:30:50.826892] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.222 [2024-11-29 09:30:50.826920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.222 [2024-11-29 09:30:50.826959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:28.222 [2024-11-29 09:30:50.826976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.222 [2024-11-29 09:30:50.827032] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:28.222 [2024-11-29 09:30:50.827048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.222 #48 NEW cov: 11891 ft: 14988 corp: 19/667b lim: 90 exec/s: 48 rss: 70Mb L: 69/87 MS: 1 InsertRepeatedBytes- 00:07:28.222 [2024-11-29 09:30:50.866702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.222 [2024-11-29 09:30:50.866736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.222 #49 NEW cov: 11891 ft: 15007 corp: 20/701b lim: 90 exec/s: 49 rss: 70Mb L: 34/87 MS: 1 CMP- DE: "\376\377\377\365"- 00:07:28.222 [2024-11-29 09:30:50.906827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.222 [2024-11-29 09:30:50.906854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.222 #50 NEW cov: 11891 ft: 15052 corp: 21/733b lim: 90 exec/s: 50 rss: 70Mb L: 32/87 MS: 1 EraseBytes- 00:07:28.222 [2024-11-29 09:30:50.956941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.222 [2024-11-29 09:30:50.956970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.222 #51 NEW cov: 11891 ft: 15058 corp: 22/766b lim: 90 exec/s: 51 rss: 70Mb L: 33/87 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\002"- 00:07:28.222 [2024-11-29 09:30:50.997202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.222 [2024-11-29 09:30:50.997229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.222 [2024-11-29 09:30:50.997283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:28.222 [2024-11-29 09:30:50.997297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.222 #52 NEW cov: 11891 ft: 15087 corp: 23/805b lim: 90 exec/s: 52 rss: 70Mb L: 39/87 MS: 1 EraseBytes- 00:07:28.222 [2024-11-29 09:30:51.037201] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.222 [2024-11-29 09:30:51.037228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.222 #53 NEW cov: 11891 ft: 15095 corp: 24/838b lim: 90 exec/s: 53 rss: 70Mb L: 33/87 MS: 1 ChangeByte- 00:07:28.481 [2024-11-29 09:30:51.077447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.481 [2024-11-29 09:30:51.077474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.481 [2024-11-29 09:30:51.077535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:28.481 [2024-11-29 09:30:51.077551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.481 #54 NEW cov: 11891 ft: 15102 corp: 25/875b lim: 90 exec/s: 54 rss: 70Mb L: 37/87 MS: 1 PersAutoDict- DE: "\376\377\377\365"- 00:07:28.481 [2024-11-29 09:30:51.117582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.481 [2024-11-29 09:30:51.117615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.481 [2024-11-29 09:30:51.117658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:28.481 [2024-11-29 09:30:51.117674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.481 #55 NEW cov: 11891 ft: 15146 corp: 26/913b lim: 90 exec/s: 55 rss: 70Mb L: 38/87 MS: 1 CrossOver- 00:07:28.481 [2024-11-29 09:30:51.158231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.481 [2024-11-29 09:30:51.158259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.481 [2024-11-29 09:30:51.158325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:28.481 [2024-11-29 09:30:51.158342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.481 [2024-11-29 09:30:51.158398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:28.481 [2024-11-29 09:30:51.158413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.481 [2024-11-29 09:30:51.158469] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 
00:07:28.481 [2024-11-29 09:30:51.158484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.481 [2024-11-29 09:30:51.158544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:07:28.481 [2024-11-29 09:30:51.158559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:28.481 #56 NEW cov: 11891 ft: 15251 corp: 27/1003b lim: 90 exec/s: 56 rss: 70Mb L: 90/90 MS: 1 InsertRepeatedBytes- 00:07:28.481 [2024-11-29 09:30:51.197663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.481 [2024-11-29 09:30:51.197691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.481 #57 NEW cov: 11891 ft: 15265 corp: 28/1036b lim: 90 exec/s: 57 rss: 70Mb L: 33/90 MS: 1 PersAutoDict- DE: "\376\377\377\365"- 00:07:28.481 [2024-11-29 09:30:51.237923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.481 [2024-11-29 09:30:51.237950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.481 [2024-11-29 09:30:51.238003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:28.481 [2024-11-29 09:30:51.238019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.481 #58 NEW cov: 11891 ft: 15274 corp: 29/1074b lim: 90 exec/s: 58 rss: 70Mb L: 38/90 MS: 1 ChangeByte- 00:07:28.481 [2024-11-29 09:30:51.278065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.481 [2024-11-29 09:30:51.278093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.481 [2024-11-29 09:30:51.278135] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:28.481 [2024-11-29 09:30:51.278150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.481 #59 NEW cov: 11891 ft: 15283 corp: 30/1110b lim: 90 exec/s: 59 rss: 70Mb L: 36/90 MS: 1 EraseBytes- 00:07:28.481 [2024-11-29 09:30:51.318533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.481 [2024-11-29 09:30:51.318562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.481 [2024-11-29 09:30:51.318613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:28.481 [2024-11-29 09:30:51.318629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.481 [2024-11-29 09:30:51.318687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:28.481 [2024-11-29 09:30:51.318702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
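Every completion notice in this stream carries the same status pair, e.g. the "(00/0b) ... dnr:1" just above. Read against the NVMe spec (an interpretation; the log does not define the notation): the first number is the status code type, the second the status code, so 00/0b is generic-status code 0x0b, Invalid Namespace or Format, and dnr:1 marks the command do-not-retry. A small lookup sketch with a deliberately abbreviated table:

    #include <stdio.h>

    static const char *generic_sc_name(unsigned sc)
    {
        switch (sc) {                 /* generic command status, sct 0x0 */
        case 0x00: return "SUCCESSFUL COMPLETION";
        case 0x01: return "INVALID COMMAND OPCODE";
        case 0x0b: return "INVALID NAMESPACE OR FORMAT";
        default:   return "(not in this abbreviated table)";
        }
    }

    int main(void)
    {
        unsigned sct = 0x00, sc = 0x0b;   /* the "(00/0b)" from the log */
        printf("sct %02x / sc %02x -> %s\n", sct, sc,
               sct == 0x00 ? generic_sc_name(sc) : "non-generic status type");
        return 0;
    }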
00:07:28.481 [2024-11-29 09:30:51.318761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:28.481 [2024-11-29 09:30:51.318777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.740 #60 NEW cov: 11891 ft: 15311 corp: 31/1185b lim: 90 exec/s: 60 rss: 70Mb L: 75/90 MS: 1 InsertRepeatedBytes- 00:07:28.741 [2024-11-29 09:30:51.358282] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.741 [2024-11-29 09:30:51.358308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.741 [2024-11-29 09:30:51.358358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:28.741 [2024-11-29 09:30:51.358373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.741 #61 NEW cov: 11891 ft: 15341 corp: 32/1222b lim: 90 exec/s: 61 rss: 70Mb L: 37/90 MS: 1 InsertByte- 00:07:28.741 [2024-11-29 09:30:51.398315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.741 [2024-11-29 09:30:51.398342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.741 #62 NEW cov: 11891 ft: 15377 corp: 33/1255b lim: 90 exec/s: 62 rss: 70Mb L: 33/90 MS: 1 ChangeByte- 00:07:28.741 [2024-11-29 09:30:51.438905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.741 [2024-11-29 09:30:51.438932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.741 [2024-11-29 09:30:51.438982] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:28.741 [2024-11-29 09:30:51.438998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:28.741 [2024-11-29 09:30:51.439052] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:28.741 [2024-11-29 09:30:51.439084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.741 [2024-11-29 09:30:51.439145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:28.741 [2024-11-29 09:30:51.439160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.741 [2024-11-29 09:30:51.478999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.741 [2024-11-29 09:30:51.479026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.741 [2024-11-29 09:30:51.479071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:28.741 [2024-11-29 09:30:51.479088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:07:28.741 [2024-11-29 09:30:51.479145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:28.741 [2024-11-29 09:30:51.479160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:28.741 [2024-11-29 09:30:51.479218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:28.741 [2024-11-29 09:30:51.479233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:28.741 #64 NEW cov: 11891 ft: 15401 corp: 34/1342b lim: 90 exec/s: 64 rss: 70Mb L: 87/90 MS: 2 CrossOver-CMP- DE: "\000\000\002\000"- 00:07:28.741 [2024-11-29 09:30:51.518611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.741 [2024-11-29 09:30:51.518639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.741 #65 NEW cov: 11891 ft: 15411 corp: 35/1375b lim: 90 exec/s: 65 rss: 70Mb L: 33/90 MS: 1 InsertByte- 00:07:28.741 [2024-11-29 09:30:51.558785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:28.741 [2024-11-29 09:30:51.558812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:28.741 #66 NEW cov: 11891 ft: 15417 corp: 36/1399b lim: 90 exec/s: 66 rss: 70Mb L: 24/90 MS: 1 ChangeBit- 00:07:29.000 [2024-11-29 09:30:51.598865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:29.000 [2024-11-29 09:30:51.598893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.000 #67 NEW cov: 11891 ft: 15437 corp: 37/1432b lim: 90 exec/s: 67 rss: 70Mb L: 33/90 MS: 1 ChangeByte- 00:07:29.000 [2024-11-29 09:30:51.639229] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:29.000 [2024-11-29 09:30:51.639256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.000 [2024-11-29 09:30:51.639326] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:29.000 [2024-11-29 09:30:51.639343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.000 #68 NEW cov: 11891 ft: 15450 corp: 38/1482b lim: 90 exec/s: 68 rss: 70Mb L: 50/90 MS: 1 ShuffleBytes- 00:07:29.000 [2024-11-29 09:30:51.679144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:29.000 [2024-11-29 09:30:51.679174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.000 #69 NEW cov: 11891 ft: 15519 corp: 39/1517b lim: 90 exec/s: 69 rss: 70Mb L: 35/90 MS: 1 CopyPart- 00:07:29.000 [2024-11-29 09:30:51.729766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:29.000 [2024-11-29 09:30:51.729794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.000 [2024-11-29 09:30:51.729836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:29.001 [2024-11-29 09:30:51.729850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.001 [2024-11-29 09:30:51.729908] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:29.001 [2024-11-29 09:30:51.729939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:29.001 [2024-11-29 09:30:51.729999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:07:29.001 [2024-11-29 09:30:51.730013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:29.001 #70 NEW cov: 11891 ft: 15524 corp: 40/1604b lim: 90 exec/s: 70 rss: 70Mb L: 87/90 MS: 1 PersAutoDict- DE: "\000\000\002\000"- 00:07:29.001 [2024-11-29 09:30:51.769726] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:07:29.001 [2024-11-29 09:30:51.769754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.001 [2024-11-29 09:30:51.769808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:07:29.001 [2024-11-29 09:30:51.769824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:29.001 [2024-11-29 09:30:51.769882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:07:29.001 [2024-11-29 09:30:51.769898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:29.001 #71 NEW cov: 11891 ft: 15525 corp: 41/1668b lim: 90 exec/s: 35 rss: 70Mb L: 64/90 MS: 1 InsertRepeatedBytes- 00:07:29.001 #71 DONE cov: 11891 ft: 15525 corp: 41/1668b lim: 90 exec/s: 35 rss: 70Mb 00:07:29.001 ###### Recommended dictionary. ###### 00:07:29.001 "\376\377\377\365" # Uses: 2 00:07:29.001 "\000\000\000\000\000\000\000\002" # Uses: 0 00:07:29.001 "\000\000\002\000" # Uses: 1 00:07:29.001 ###### End of recommended dictionary. 
###### 00:07:29.001 Done 71 runs in 2 second(s) 00:07:29.260 09:30:51 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:07:29.260 09:30:51 -- ../common.sh@72 -- # (( i++ )) 00:07:29.260 09:30:51 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:29.260 09:30:51 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:07:29.260 09:30:51 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:07:29.260 09:30:51 -- nvmf/run.sh@24 -- # local timen=1 00:07:29.260 09:30:51 -- nvmf/run.sh@25 -- # local core=0x1 00:07:29.260 09:30:51 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:29.260 09:30:51 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:07:29.260 09:30:51 -- nvmf/run.sh@29 -- # printf %02d 21 00:07:29.260 09:30:51 -- nvmf/run.sh@29 -- # port=4421 00:07:29.260 09:30:51 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:29.260 09:30:51 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:07:29.260 09:30:51 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:29.260 09:30:51 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:07:29.260 [2024-11-29 09:30:51.955574] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:29.260 [2024-11-29 09:30:51.955669] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3188467 ] 00:07:29.260 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.520 [2024-11-29 09:30:52.133999] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.520 [2024-11-29 09:30:52.197880] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:29.520 [2024-11-29 09:30:52.198006] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.520 [2024-11-29 09:30:52.255692] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:29.520 [2024-11-29 09:30:52.272060] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:07:29.520 INFO: Running with entropic power schedule (0xFF, 100). 00:07:29.520 INFO: Seed: 3490904195 00:07:29.520 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:29.520 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:29.520 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:07:29.520 INFO: A corpus is not provided, starting from an empty corpus 00:07:29.520 #2 INITED exec/s: 0 rss: 60Mb 00:07:29.520 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
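Reading the run-20 summary above with libFuzzer's usual conventions (an interpretation; the log itself does not define the fields): corp: 41/1668b means the corpus ended at 41 units totalling 1668 bytes under the 90-byte length limit, and the throughput on the #71 DONE line is total executions over elapsed wall time, integer-truncated:

    \text{exec/s} = \lfloor 71 / 2 \rfloor = 35

The same arithmetic holds for the run that follows, which finishes with 67 runs in 2 seconds and reports exec/s: 33.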
00:07:29.520 This may also happen if the target rejected all inputs we tried so far 00:07:29.520 [2024-11-29 09:30:52.317112] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:29.520 [2024-11-29 09:30:52.317142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:29.779 NEW_FUNC[1/671]: 0x45fc08 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:07:29.779 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:29.779 #8 NEW cov: 11634 ft: 11635 corp: 2/11b lim: 50 exec/s: 0 rss: 68Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:30.038 [2024-11-29 09:30:52.637807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.038 [2024-11-29 09:30:52.637838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.038 NEW_FUNC[1/1]: 0x1546c88 in nvme_ctrlr_get_ready_timeout /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ctrlr.c:1211 00:07:30.038 #11 NEW cov: 11752 ft: 12028 corp: 3/28b lim: 50 exec/s: 0 rss: 68Mb L: 17/17 MS: 3 CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:07:30.038 [2024-11-29 09:30:52.677836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.038 [2024-11-29 09:30:52.677864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.038 #12 NEW cov: 11758 ft: 12217 corp: 4/38b lim: 50 exec/s: 0 rss: 68Mb L: 10/17 MS: 1 ChangeByte- 00:07:30.038 [2024-11-29 09:30:52.717962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.038 [2024-11-29 09:30:52.717989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.038 #13 NEW cov: 11843 ft: 12490 corp: 5/48b lim: 50 exec/s: 0 rss: 68Mb L: 10/17 MS: 1 ChangeBinInt- 00:07:30.038 [2024-11-29 09:30:52.758103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.038 [2024-11-29 09:30:52.758129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.038 #14 NEW cov: 11843 ft: 12581 corp: 6/59b lim: 50 exec/s: 0 rss: 68Mb L: 11/17 MS: 1 InsertByte- 00:07:30.038 [2024-11-29 09:30:52.798188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.038 [2024-11-29 09:30:52.798214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.038 #15 NEW cov: 11843 ft: 12646 corp: 7/69b lim: 50 exec/s: 0 rss: 68Mb L: 10/17 MS: 1 ChangeByte- 00:07:30.038 [2024-11-29 09:30:52.838326] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.038 [2024-11-29 09:30:52.838352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.038 #16 NEW cov: 11843 ft: 
12710 corp: 8/79b lim: 50 exec/s: 0 rss: 68Mb L: 10/17 MS: 1 ShuffleBytes- 00:07:30.038 [2024-11-29 09:30:52.878574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.038 [2024-11-29 09:30:52.878604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.038 [2024-11-29 09:30:52.878655] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:30.038 [2024-11-29 09:30:52.878671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.297 #18 NEW cov: 11843 ft: 13646 corp: 9/107b lim: 50 exec/s: 0 rss: 68Mb L: 28/28 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:30.297 [2024-11-29 09:30:52.918473] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.297 [2024-11-29 09:30:52.918500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.297 #20 NEW cov: 11843 ft: 13659 corp: 10/121b lim: 50 exec/s: 0 rss: 68Mb L: 14/28 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:30.297 [2024-11-29 09:30:52.948647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.297 [2024-11-29 09:30:52.948674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.297 #21 NEW cov: 11843 ft: 13719 corp: 11/131b lim: 50 exec/s: 0 rss: 68Mb L: 10/28 MS: 1 ShuffleBytes- 00:07:30.297 [2024-11-29 09:30:52.978790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.297 [2024-11-29 09:30:52.978817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.297 [2024-11-29 09:30:52.978860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:30.297 [2024-11-29 09:30:52.978874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.297 #23 NEW cov: 11843 ft: 13775 corp: 12/153b lim: 50 exec/s: 0 rss: 68Mb L: 22/28 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:30.297 [2024-11-29 09:30:53.018821] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.297 [2024-11-29 09:30:53.018846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.297 #24 NEW cov: 11843 ft: 13802 corp: 13/164b lim: 50 exec/s: 0 rss: 68Mb L: 11/28 MS: 1 CopyPart- 00:07:30.297 [2024-11-29 09:30:53.058952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.297 [2024-11-29 09:30:53.058979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.297 #25 NEW cov: 11843 ft: 13815 corp: 14/178b lim: 50 exec/s: 0 rss: 68Mb L: 14/28 MS: 1 InsertRepeatedBytes- 00:07:30.297 [2024-11-29 09:30:53.089029] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.297 [2024-11-29 
09:30:53.089056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.297 #31 NEW cov: 11843 ft: 13845 corp: 15/188b lim: 50 exec/s: 0 rss: 69Mb L: 10/28 MS: 1 ChangeBinInt- 00:07:30.297 [2024-11-29 09:30:53.129181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.297 [2024-11-29 09:30:53.129210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.556 #32 NEW cov: 11843 ft: 13853 corp: 16/205b lim: 50 exec/s: 0 rss: 69Mb L: 17/28 MS: 1 ChangeByte- 00:07:30.556 [2024-11-29 09:30:53.169696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.556 [2024-11-29 09:30:53.169723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.556 [2024-11-29 09:30:53.169760] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:30.556 [2024-11-29 09:30:53.169774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.556 [2024-11-29 09:30:53.169824] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:30.556 [2024-11-29 09:30:53.169839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.556 [2024-11-29 09:30:53.169906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:30.556 [2024-11-29 09:30:53.169921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:30.556 #33 NEW cov: 11843 ft: 14241 corp: 17/254b lim: 50 exec/s: 0 rss: 69Mb L: 49/49 MS: 1 InsertRepeatedBytes- 00:07:30.556 [2024-11-29 09:30:53.209389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.556 [2024-11-29 09:30:53.209415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.556 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:30.556 #34 NEW cov: 11866 ft: 14387 corp: 18/264b lim: 50 exec/s: 0 rss: 69Mb L: 10/49 MS: 1 ChangeBit- 00:07:30.556 [2024-11-29 09:30:53.249524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.556 [2024-11-29 09:30:53.249550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.556 #35 NEW cov: 11866 ft: 14407 corp: 19/275b lim: 50 exec/s: 0 rss: 69Mb L: 11/49 MS: 1 ChangeByte- 00:07:30.556 [2024-11-29 09:30:53.289628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.556 [2024-11-29 09:30:53.289654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.556 #36 NEW cov: 11866 ft: 14424 corp: 20/285b lim: 50 exec/s: 36 rss: 69Mb L: 10/49 MS: 1 ChangeByte- 00:07:30.556 [2024-11-29 
09:30:53.329743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.556 [2024-11-29 09:30:53.329770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.556 #37 NEW cov: 11866 ft: 14432 corp: 21/299b lim: 50 exec/s: 37 rss: 69Mb L: 14/49 MS: 1 ChangeBinInt- 00:07:30.556 [2024-11-29 09:30:53.369885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.556 [2024-11-29 09:30:53.369913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.556 #38 NEW cov: 11866 ft: 14445 corp: 22/316b lim: 50 exec/s: 38 rss: 69Mb L: 17/49 MS: 1 ChangeBinInt- 00:07:30.815 [2024-11-29 09:30:53.410293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.815 [2024-11-29 09:30:53.410322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.815 [2024-11-29 09:30:53.410370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:30.815 [2024-11-29 09:30:53.410387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.815 [2024-11-29 09:30:53.410440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:30.815 [2024-11-29 09:30:53.410454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:30.815 #39 NEW cov: 11866 ft: 14699 corp: 23/348b lim: 50 exec/s: 39 rss: 69Mb L: 32/49 MS: 1 InsertRepeatedBytes- 00:07:30.815 [2024-11-29 09:30:53.450080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.815 [2024-11-29 09:30:53.450107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.815 #40 NEW cov: 11866 ft: 14709 corp: 24/366b lim: 50 exec/s: 40 rss: 69Mb L: 18/49 MS: 1 InsertByte- 00:07:30.815 [2024-11-29 09:30:53.480337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.815 [2024-11-29 09:30:53.480364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.815 [2024-11-29 09:30:53.480420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:30.815 [2024-11-29 09:30:53.480437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:30.815 #41 NEW cov: 11866 ft: 14765 corp: 25/386b lim: 50 exec/s: 41 rss: 69Mb L: 20/49 MS: 1 InsertRepeatedBytes- 00:07:30.815 [2024-11-29 09:30:53.520304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.815 [2024-11-29 09:30:53.520331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.815 #42 NEW cov: 11866 ft: 14826 corp: 26/400b lim: 50 exec/s: 42 rss: 69Mb L: 14/49 MS: 1 
ChangeBit- 00:07:30.816 [2024-11-29 09:30:53.560417] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.816 [2024-11-29 09:30:53.560445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.816 [2024-11-29 09:30:53.600544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.816 [2024-11-29 09:30:53.600570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:30.816 #44 NEW cov: 11866 ft: 14842 corp: 27/412b lim: 50 exec/s: 44 rss: 69Mb L: 12/49 MS: 2 EraseBytes-EraseBytes- 00:07:30.816 [2024-11-29 09:30:53.640625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:30.816 [2024-11-29 09:30:53.640651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.073 #45 NEW cov: 11866 ft: 14866 corp: 28/427b lim: 50 exec/s: 45 rss: 69Mb L: 15/49 MS: 1 CrossOver- 00:07:31.073 [2024-11-29 09:30:53.671012] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.073 [2024-11-29 09:30:53.671039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.074 [2024-11-29 09:30:53.671081] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:31.074 [2024-11-29 09:30:53.671096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.074 [2024-11-29 09:30:53.671148] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:31.074 [2024-11-29 09:30:53.671164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.074 #46 NEW cov: 11866 ft: 14990 corp: 29/462b lim: 50 exec/s: 46 rss: 69Mb L: 35/49 MS: 1 InsertRepeatedBytes- 00:07:31.074 [2024-11-29 09:30:53.710979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.074 [2024-11-29 09:30:53.711005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.074 [2024-11-29 09:30:53.711047] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:31.074 [2024-11-29 09:30:53.711062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.074 #47 NEW cov: 11866 ft: 14992 corp: 30/482b lim: 50 exec/s: 47 rss: 69Mb L: 20/49 MS: 1 EraseBytes- 00:07:31.074 [2024-11-29 09:30:53.750951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.074 [2024-11-29 09:30:53.750977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.074 #48 NEW cov: 11866 ft: 15021 corp: 31/496b lim: 50 exec/s: 48 rss: 70Mb L: 14/49 MS: 1 ShuffleBytes- 00:07:31.074 [2024-11-29 09:30:53.791071] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.074 [2024-11-29 09:30:53.791097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.074 #49 NEW cov: 11866 ft: 15037 corp: 32/511b lim: 50 exec/s: 49 rss: 70Mb L: 15/49 MS: 1 ShuffleBytes- 00:07:31.074 [2024-11-29 09:30:53.831186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.074 [2024-11-29 09:30:53.831213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.074 #50 NEW cov: 11866 ft: 15053 corp: 33/521b lim: 50 exec/s: 50 rss: 70Mb L: 10/49 MS: 1 CopyPart- 00:07:31.074 [2024-11-29 09:30:53.861260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.074 [2024-11-29 09:30:53.861287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.074 #51 NEW cov: 11866 ft: 15062 corp: 34/535b lim: 50 exec/s: 51 rss: 70Mb L: 14/49 MS: 1 ShuffleBytes- 00:07:31.074 [2024-11-29 09:30:53.901783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.074 [2024-11-29 09:30:53.901808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.074 [2024-11-29 09:30:53.901851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:31.074 [2024-11-29 09:30:53.901866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.074 [2024-11-29 09:30:53.901917] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:31.074 [2024-11-29 09:30:53.901931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.074 [2024-11-29 09:30:53.901983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:31.074 [2024-11-29 09:30:53.901997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.332 #52 NEW cov: 11866 ft: 15069 corp: 35/579b lim: 50 exec/s: 52 rss: 70Mb L: 44/49 MS: 1 InsertRepeatedBytes- 00:07:31.332 [2024-11-29 09:30:53.941903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.332 [2024-11-29 09:30:53.941930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.332 [2024-11-29 09:30:53.941967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:31.332 [2024-11-29 09:30:53.941980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.332 [2024-11-29 09:30:53.942035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:31.332 [2024-11-29 09:30:53.942049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.332 [2024-11-29 09:30:53.942102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:31.332 [2024-11-29 09:30:53.942117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.332 #53 NEW cov: 11866 ft: 15076 corp: 36/628b lim: 50 exec/s: 53 rss: 70Mb L: 49/49 MS: 1 ChangeBinInt- 00:07:31.332 [2024-11-29 09:30:53.981777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.332 [2024-11-29 09:30:53.981803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.332 [2024-11-29 09:30:53.981856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:31.332 [2024-11-29 09:30:53.981871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.332 #54 NEW cov: 11866 ft: 15080 corp: 37/650b lim: 50 exec/s: 54 rss: 70Mb L: 22/49 MS: 1 CrossOver- 00:07:31.332 [2024-11-29 09:30:54.021748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.332 [2024-11-29 09:30:54.021774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.332 #55 NEW cov: 11866 ft: 15101 corp: 38/660b lim: 50 exec/s: 55 rss: 70Mb L: 10/49 MS: 1 ChangeByte- 00:07:31.332 [2024-11-29 09:30:54.061848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.332 [2024-11-29 09:30:54.061874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.332 #56 NEW cov: 11866 ft: 15120 corp: 39/671b lim: 50 exec/s: 56 rss: 70Mb L: 11/49 MS: 1 InsertByte- 00:07:31.332 [2024-11-29 09:30:54.102385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.332 [2024-11-29 09:30:54.102411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.332 [2024-11-29 09:30:54.102447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:31.332 [2024-11-29 09:30:54.102462] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.332 [2024-11-29 09:30:54.102516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:31.332 [2024-11-29 09:30:54.102531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.332 [2024-11-29 09:30:54.102586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:07:31.332 [2024-11-29 09:30:54.102604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:31.332 #62 NEW cov: 11866 ft: 15121 corp: 40/715b lim: 50 exec/s: 62 rss: 70Mb L: 44/49 MS: 1 CopyPart- 00:07:31.332 [2024-11-29 09:30:54.142097] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.332 [2024-11-29 09:30:54.142124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.332 #63 NEW cov: 11866 ft: 15126 corp: 41/730b lim: 50 exec/s: 63 rss: 70Mb L: 15/49 MS: 1 ChangeBit- 00:07:31.591 [2024-11-29 09:30:54.182489] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.591 [2024-11-29 09:30:54.182519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.591 [2024-11-29 09:30:54.182572] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:07:31.591 [2024-11-29 09:30:54.182586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:31.591 [2024-11-29 09:30:54.182643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:07:31.591 [2024-11-29 09:30:54.182659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:31.591 #64 NEW cov: 11866 ft: 15134 corp: 42/766b lim: 50 exec/s: 64 rss: 70Mb L: 36/49 MS: 1 CrossOver- 00:07:31.591 [2024-11-29 09:30:54.222333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.591 [2024-11-29 09:30:54.222360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.591 #65 NEW cov: 11866 ft: 15206 corp: 43/776b lim: 50 exec/s: 65 rss: 70Mb L: 10/49 MS: 1 ChangeBit- 00:07:31.591 [2024-11-29 09:30:54.252395] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.591 [2024-11-29 09:30:54.252420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.591 #66 NEW cov: 11866 ft: 15210 corp: 44/791b lim: 50 exec/s: 66 rss: 70Mb L: 15/49 MS: 1 ChangeByte- 00:07:31.591 [2024-11-29 09:30:54.292526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:07:31.591 [2024-11-29 09:30:54.292553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:31.591 #67 NEW cov: 11866 ft: 15214 corp: 45/804b lim: 50 exec/s: 33 rss: 70Mb L: 13/49 MS: 1 InsertByte- 00:07:31.591 #67 DONE cov: 11866 ft: 15214 corp: 45/804b lim: 50 exec/s: 33 rss: 70Mb 00:07:31.591 Done 67 runs in 2 second(s) 00:07:31.849 09:30:54 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:07:31.849 09:30:54 -- ../common.sh@72 -- # (( i++ )) 00:07:31.849 09:30:54 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:31.849 09:30:54 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:07:31.849 09:30:54 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:07:31.849 09:30:54 -- nvmf/run.sh@24 -- # local timen=1 00:07:31.849 09:30:54 -- nvmf/run.sh@25 -- # local core=0x1 00:07:31.849 09:30:54 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:31.849 09:30:54 -- nvmf/run.sh@27 -- # local 
nvmf_cfg=/tmp/fuzz_json_22.conf 00:07:31.849 09:30:54 -- nvmf/run.sh@29 -- # printf %02d 22 00:07:31.849 09:30:54 -- nvmf/run.sh@29 -- # port=4422 00:07:31.849 09:30:54 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:31.849 09:30:54 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:07:31.849 09:30:54 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:31.850 09:30:54 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:07:31.850 [2024-11-29 09:30:54.485427] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:31.850 [2024-11-29 09:30:54.485520] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3188805 ] 00:07:31.850 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.850 [2024-11-29 09:30:54.663351] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.108 [2024-11-29 09:30:54.731702] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:32.108 [2024-11-29 09:30:54.731841] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.108 [2024-11-29 09:30:54.789629] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:32.108 [2024-11-29 09:30:54.805992] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:07:32.108 INFO: Running with entropic power schedule (0xFF, 100). 00:07:32.108 INFO: Seed: 1727950038 00:07:32.108 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:32.108 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:32.108 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:07:32.108 INFO: A corpus is not provided, starting from an empty corpus 00:07:32.108 #2 INITED exec/s: 0 rss: 60Mb 00:07:32.108 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
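The run.sh trace above shows the per-run wiring: printf %02d 22 zero-pads the fuzzer number, the listener port becomes 4422, and sed rewrites trsvcid 4420 to 4422 into a fresh /tmp/fuzz_json_22.conf so each fuzzer gets its own target. A small C restatement of just the port arithmetic, under the assumption (borne out by runs 20 through 22 here, listening on 4420, 4421, and 4422) that the pattern is "44" plus the two-digit fuzzer number:

    #include <stdio.h>

    int main(void)
    {
        for (int fuzzer_type = 20; fuzzer_type <= 22; fuzzer_type++) {
            char port[8];

            /* "44" followed by the zero-padded fuzzer number: 20 -> 4420,
             * 21 -> 4421, 22 -> 4422, matching the listen ports in this log */
            snprintf(port, sizeof(port), "44%02d", fuzzer_type);
            printf("fuzzer %d -> NVMe/TCP trsvcid %s\n", fuzzer_type, port);
        }
        return 0;
    }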
00:07:32.108 This may also happen if the target rejected all inputs we tried so far 00:07:32.108 [2024-11-29 09:30:54.854500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:32.108 [2024-11-29 09:30:54.854535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.108 [2024-11-29 09:30:54.854586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:32.108 [2024-11-29 09:30:54.854613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.108 [2024-11-29 09:30:54.854644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:32.108 [2024-11-29 09:30:54.854660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.367 NEW_FUNC[1/672]: 0x461ed8 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:07:32.367 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:32.367 #4 NEW cov: 11660 ft: 11661 corp: 2/60b lim: 85 exec/s: 0 rss: 68Mb L: 59/59 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:32.367 [2024-11-29 09:30:55.175249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:32.367 [2024-11-29 09:30:55.175285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.367 [2024-11-29 09:30:55.175334] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:32.367 [2024-11-29 09:30:55.175352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.367 [2024-11-29 09:30:55.175381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:32.367 [2024-11-29 09:30:55.175397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.631 #5 NEW cov: 11778 ft: 12165 corp: 3/119b lim: 85 exec/s: 0 rss: 69Mb L: 59/59 MS: 1 CopyPart- 00:07:32.631 [2024-11-29 09:30:55.245280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:32.631 [2024-11-29 09:30:55.245309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.631 [2024-11-29 09:30:55.245357] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:32.631 [2024-11-29 09:30:55.245374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.631 #6 NEW cov: 11784 ft: 12773 corp: 4/167b lim: 85 exec/s: 0 rss: 69Mb L: 48/59 MS: 1 EraseBytes- 00:07:32.631 [2024-11-29 09:30:55.305432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:32.631 [2024-11-29 09:30:55.305461] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.631 [2024-11-29 09:30:55.305509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:32.631 [2024-11-29 09:30:55.305527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.631 #12 NEW cov: 11869 ft: 13084 corp: 5/217b lim: 85 exec/s: 0 rss: 69Mb L: 50/59 MS: 1 InsertRepeatedBytes- 00:07:32.631 [2024-11-29 09:30:55.365561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:32.631 [2024-11-29 09:30:55.365590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.631 [2024-11-29 09:30:55.365645] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:32.631 [2024-11-29 09:30:55.365663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.631 #13 NEW cov: 11869 ft: 13200 corp: 6/265b lim: 85 exec/s: 0 rss: 69Mb L: 48/59 MS: 1 ChangeBinInt- 00:07:32.631 [2024-11-29 09:30:55.435830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:32.631 [2024-11-29 09:30:55.435862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.631 [2024-11-29 09:30:55.435897] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:32.631 [2024-11-29 09:30:55.435915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.631 #16 NEW cov: 11869 ft: 13301 corp: 7/306b lim: 85 exec/s: 0 rss: 69Mb L: 41/59 MS: 3 ChangeByte-CrossOver-InsertRepeatedBytes- 00:07:32.888 [2024-11-29 09:30:55.485890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:32.888 [2024-11-29 09:30:55.485918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.888 [2024-11-29 09:30:55.485966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:32.888 [2024-11-29 09:30:55.485983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.888 #17 NEW cov: 11869 ft: 13356 corp: 8/347b lim: 85 exec/s: 0 rss: 69Mb L: 41/59 MS: 1 ChangeByte- 00:07:32.889 [2024-11-29 09:30:55.556104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:32.889 [2024-11-29 09:30:55.556132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.889 [2024-11-29 09:30:55.556180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:32.889 [2024-11-29 09:30:55.556198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.889 #18 NEW cov: 11869 ft: 13396 corp: 9/390b 
lim: 85 exec/s: 0 rss: 69Mb L: 43/59 MS: 1 CMP- DE: "\017\000"- 00:07:32.889 [2024-11-29 09:30:55.606309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:32.889 [2024-11-29 09:30:55.606338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.889 [2024-11-29 09:30:55.606385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:32.889 [2024-11-29 09:30:55.606402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.889 [2024-11-29 09:30:55.606435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:32.889 [2024-11-29 09:30:55.606451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.889 [2024-11-29 09:30:55.606479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:32.889 [2024-11-29 09:30:55.606494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:32.889 #19 NEW cov: 11869 ft: 13787 corp: 10/464b lim: 85 exec/s: 0 rss: 69Mb L: 74/74 MS: 1 CopyPart- 00:07:32.889 [2024-11-29 09:30:55.666493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:32.889 [2024-11-29 09:30:55.666522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:32.889 [2024-11-29 09:30:55.666570] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:32.889 [2024-11-29 09:30:55.666588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:32.889 [2024-11-29 09:30:55.666624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:32.889 [2024-11-29 09:30:55.666641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:32.889 [2024-11-29 09:30:55.666670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:32.889 [2024-11-29 09:30:55.666686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:32.889 #20 NEW cov: 11869 ft: 13927 corp: 11/538b lim: 85 exec/s: 0 rss: 69Mb L: 74/74 MS: 1 ShuffleBytes- 00:07:33.147 [2024-11-29 09:30:55.736619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.147 [2024-11-29 09:30:55.736651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.147 [2024-11-29 09:30:55.736700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:33.147 [2024-11-29 09:30:55.736717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.147 NEW_FUNC[1/1]: 0x194e708 in 
get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:33.147 #21 NEW cov: 11886 ft: 13975 corp: 12/579b lim: 85 exec/s: 0 rss: 69Mb L: 41/74 MS: 1 PersAutoDict- DE: "\017\000"- 00:07:33.147 [2024-11-29 09:30:55.786800] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.147 [2024-11-29 09:30:55.786829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.147 [2024-11-29 09:30:55.786877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:33.147 [2024-11-29 09:30:55.786894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.147 [2024-11-29 09:30:55.786923] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:33.147 [2024-11-29 09:30:55.786940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.147 [2024-11-29 09:30:55.786968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:33.147 [2024-11-29 09:30:55.786984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.147 #22 NEW cov: 11886 ft: 14025 corp: 13/654b lim: 85 exec/s: 22 rss: 69Mb L: 75/75 MS: 1 InsertByte- 00:07:33.147 [2024-11-29 09:30:55.856901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.147 [2024-11-29 09:30:55.856930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.147 [2024-11-29 09:30:55.856977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:33.147 [2024-11-29 09:30:55.856995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.147 #23 NEW cov: 11886 ft: 14043 corp: 14/695b lim: 85 exec/s: 23 rss: 69Mb L: 41/75 MS: 1 ChangeBit- 00:07:33.147 [2024-11-29 09:30:55.907246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.147 [2024-11-29 09:30:55.907277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.147 [2024-11-29 09:30:55.907310] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:33.147 [2024-11-29 09:30:55.907327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.147 #24 NEW cov: 11886 ft: 14073 corp: 15/745b lim: 85 exec/s: 24 rss: 69Mb L: 50/75 MS: 1 PersAutoDict- DE: "\017\000"- 00:07:33.147 [2024-11-29 09:30:55.957246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.147 [2024-11-29 09:30:55.957274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.147 [2024-11-29 09:30:55.957321] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:33.147 [2024-11-29 09:30:55.957338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.147 [2024-11-29 09:30:55.957367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:33.147 [2024-11-29 09:30:55.957383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.147 [2024-11-29 09:30:55.957411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:33.147 [2024-11-29 09:30:55.957426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.405 #25 NEW cov: 11886 ft: 14107 corp: 16/828b lim: 85 exec/s: 25 rss: 69Mb L: 83/83 MS: 1 InsertRepeatedBytes- 00:07:33.405 [2024-11-29 09:30:56.027286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.405 [2024-11-29 09:30:56.027315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.405 #26 NEW cov: 11886 ft: 14914 corp: 17/846b lim: 85 exec/s: 26 rss: 70Mb L: 18/83 MS: 1 CrossOver- 00:07:33.405 [2024-11-29 09:30:56.087565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.405 [2024-11-29 09:30:56.087595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.405 [2024-11-29 09:30:56.087651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:33.405 [2024-11-29 09:30:56.087668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.405 [2024-11-29 09:30:56.087698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:33.405 [2024-11-29 09:30:56.087714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.405 [2024-11-29 09:30:56.087742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:33.405 [2024-11-29 09:30:56.087761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.405 #27 NEW cov: 11886 ft: 14941 corp: 18/929b lim: 85 exec/s: 27 rss: 70Mb L: 83/83 MS: 1 ChangeBit- 00:07:33.405 [2024-11-29 09:30:56.147666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.405 [2024-11-29 09:30:56.147696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.405 [2024-11-29 09:30:56.147745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:33.405 [2024-11-29 09:30:56.147762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.405 #28 NEW cov: 11886 ft: 14954 corp: 
19/974b lim: 85 exec/s: 28 rss: 70Mb L: 45/83 MS: 1 CopyPart- 00:07:33.405 [2024-11-29 09:30:56.217960] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.405 [2024-11-29 09:30:56.217989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.405 [2024-11-29 09:30:56.218036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:33.405 [2024-11-29 09:30:56.218053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.405 [2024-11-29 09:30:56.218082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:33.405 [2024-11-29 09:30:56.218098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.405 [2024-11-29 09:30:56.218125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:33.405 [2024-11-29 09:30:56.218141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:33.662 #29 NEW cov: 11886 ft: 14960 corp: 20/1052b lim: 85 exec/s: 29 rss: 70Mb L: 78/83 MS: 1 InsertRepeatedBytes- 00:07:33.662 [2024-11-29 09:30:56.288086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.662 [2024-11-29 09:30:56.288115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.662 [2024-11-29 09:30:56.288162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:33.662 [2024-11-29 09:30:56.288179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.662 [2024-11-29 09:30:56.288208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:33.662 [2024-11-29 09:30:56.288224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:33.662 #30 NEW cov: 11886 ft: 14968 corp: 21/1103b lim: 85 exec/s: 30 rss: 70Mb L: 51/83 MS: 1 InsertByte- 00:07:33.662 [2024-11-29 09:30:56.338129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.662 [2024-11-29 09:30:56.338159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.662 [2024-11-29 09:30:56.338207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:33.662 [2024-11-29 09:30:56.338224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.662 #36 NEW cov: 11886 ft: 14991 corp: 22/1144b lim: 85 exec/s: 36 rss: 70Mb L: 41/83 MS: 1 ShuffleBytes- 00:07:33.662 [2024-11-29 09:30:56.398267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.662 [2024-11-29 09:30:56.398317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.663 #37 NEW cov: 11886 ft: 15027 corp: 23/1162b lim: 85 exec/s: 37 rss: 70Mb L: 18/83 MS: 1 ChangeBinInt- 00:07:33.663 [2024-11-29 09:30:56.468524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.663 [2024-11-29 09:30:56.468554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.663 [2024-11-29 09:30:56.468612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:33.663 [2024-11-29 09:30:56.468630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.920 #43 NEW cov: 11886 ft: 15028 corp: 24/1205b lim: 85 exec/s: 43 rss: 70Mb L: 43/83 MS: 1 PersAutoDict- DE: "\017\000"- 00:07:33.920 [2024-11-29 09:30:56.538688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.920 [2024-11-29 09:30:56.538717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.920 [2024-11-29 09:30:56.538765] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:33.920 [2024-11-29 09:30:56.538783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.920 #44 NEW cov: 11886 ft: 15039 corp: 25/1250b lim: 85 exec/s: 44 rss: 70Mb L: 45/83 MS: 1 ChangeBit- 00:07:33.920 [2024-11-29 09:30:56.608845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.920 [2024-11-29 09:30:56.608875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.920 #45 NEW cov: 11886 ft: 15068 corp: 26/1268b lim: 85 exec/s: 45 rss: 70Mb L: 18/83 MS: 1 ChangeBit- 00:07:33.920 [2024-11-29 09:30:56.679050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.920 [2024-11-29 09:30:56.679079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.920 [2024-11-29 09:30:56.679127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:33.920 [2024-11-29 09:30:56.679144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:33.920 #46 NEW cov: 11886 ft: 15079 corp: 27/1311b lim: 85 exec/s: 46 rss: 70Mb L: 43/83 MS: 1 ChangeBinInt- 00:07:33.920 [2024-11-29 09:30:56.739202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:33.920 [2024-11-29 09:30:56.739231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:33.920 [2024-11-29 09:30:56.739279] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:33.920 [2024-11-29 09:30:56.739296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 
m:0 dnr:1 00:07:34.180 #47 NEW cov: 11893 ft: 15112 corp: 28/1356b lim: 85 exec/s: 47 rss: 70Mb L: 45/83 MS: 1 CMP- DE: "\000\000\001\000"- 00:07:34.180 [2024-11-29 09:30:56.789405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:07:34.180 [2024-11-29 09:30:56.789434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.180 [2024-11-29 09:30:56.789480] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:07:34.180 [2024-11-29 09:30:56.789497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.180 [2024-11-29 09:30:56.789531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:07:34.180 [2024-11-29 09:30:56.789547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.180 [2024-11-29 09:30:56.789574] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:07:34.180 [2024-11-29 09:30:56.789590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:34.180 #48 NEW cov: 11893 ft: 15113 corp: 29/1434b lim: 85 exec/s: 24 rss: 70Mb L: 78/83 MS: 1 ChangeByte- 00:07:34.180 #48 DONE cov: 11893 ft: 15113 corp: 29/1434b lim: 85 exec/s: 24 rss: 70Mb 00:07:34.180 ###### Recommended dictionary. ###### 00:07:34.180 "\017\000" # Uses: 4 00:07:34.180 "\000\000\001\000" # Uses: 0 00:07:34.180 ###### End of recommended dictionary. 
###### 00:07:34.180 Done 48 runs in 2 second(s) 00:07:34.180 09:30:56 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:07:34.180 09:30:56 -- ../common.sh@72 -- # (( i++ )) 00:07:34.180 09:30:56 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:34.180 09:30:56 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:07:34.180 09:30:56 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:07:34.180 09:30:56 -- nvmf/run.sh@24 -- # local timen=1 00:07:34.180 09:30:56 -- nvmf/run.sh@25 -- # local core=0x1 00:07:34.180 09:30:56 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:34.180 09:30:56 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:07:34.180 09:30:56 -- nvmf/run.sh@29 -- # printf %02d 23 00:07:34.180 09:30:56 -- nvmf/run.sh@29 -- # port=4423 00:07:34.180 09:30:56 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:34.180 09:30:56 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:07:34.180 09:30:56 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:34.180 09:30:56 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:07:34.180 [2024-11-29 09:30:57.004669] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:34.180 [2024-11-29 09:30:57.004740] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3189341 ] 00:07:34.440 EAL: No free 2048 kB hugepages reported on node 1 00:07:34.440 [2024-11-29 09:30:57.179966] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.440 [2024-11-29 09:30:57.243009] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:34.440 [2024-11-29 09:30:57.243152] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.699 [2024-11-29 09:30:57.300972] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:34.700 [2024-11-29 09:30:57.317340] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:07:34.700 INFO: Running with entropic power schedule (0xFF, 100). 00:07:34.700 INFO: Seed: 4238936930 00:07:34.700 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:34.700 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:34.700 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:07:34.700 INFO: A corpus is not provided, starting from an empty corpus 00:07:34.700 #2 INITED exec/s: 0 rss: 60Mb 00:07:34.700 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:34.700 This may also happen if the target rejected all inputs we tried so far 00:07:34.700 [2024-11-29 09:30:57.386643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:34.700 [2024-11-29 09:30:57.386691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.700 [2024-11-29 09:30:57.386826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:34.700 [2024-11-29 09:30:57.386848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.960 NEW_FUNC[1/670]: 0x465118 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:07:34.960 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:34.960 #12 NEW cov: 11596 ft: 11596 corp: 2/13b lim: 25 exec/s: 0 rss: 68Mb L: 12/12 MS: 5 CopyPart-ChangeByte-ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:07:34.960 [2024-11-29 09:30:57.707674] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:34.960 [2024-11-29 09:30:57.707714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.960 [2024-11-29 09:30:57.707820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:34.960 [2024-11-29 09:30:57.707842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.960 NEW_FUNC[1/1]: 0x1c55e88 in _get_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:328 00:07:34.960 #19 NEW cov: 11710 ft: 12153 corp: 3/26b lim: 25 exec/s: 0 rss: 68Mb L: 13/13 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:34.960 [2024-11-29 09:30:57.757870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:34.960 [2024-11-29 09:30:57.757903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:34.960 [2024-11-29 09:30:57.758036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:34.960 [2024-11-29 09:30:57.758059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:34.960 [2024-11-29 09:30:57.758178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:34.960 [2024-11-29 09:30:57.758200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:34.960 #20 NEW cov: 11716 ft: 12668 corp: 4/45b lim: 25 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 CopyPart- 00:07:35.220 [2024-11-29 09:30:57.817887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:35.220 [2024-11-29 09:30:57.817916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.220 [2024-11-29 09:30:57.818059] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:35.220 [2024-11-29 09:30:57.818084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.220 #21 NEW cov: 11801 ft: 13002 corp: 5/57b lim: 25 exec/s: 0 rss: 68Mb L: 12/19 MS: 1 ChangeBinInt- 00:07:35.220 [2024-11-29 09:30:57.878197] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:35.220 [2024-11-29 09:30:57.878229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.220 [2024-11-29 09:30:57.878364] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:35.220 [2024-11-29 09:30:57.878386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.220 [2024-11-29 09:30:57.878508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:35.220 [2024-11-29 09:30:57.878535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.220 #22 NEW cov: 11801 ft: 13075 corp: 6/76b lim: 25 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 CopyPart- 00:07:35.220 [2024-11-29 09:30:57.927825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:35.220 [2024-11-29 09:30:57.927853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.220 [2024-11-29 09:30:57.927980] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:35.220 [2024-11-29 09:30:57.927999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.220 #23 NEW cov: 11801 ft: 13166 corp: 7/89b lim: 25 exec/s: 0 rss: 68Mb L: 13/19 MS: 1 ShuffleBytes- 00:07:35.220 [2024-11-29 09:30:57.967908] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:35.220 [2024-11-29 09:30:57.967935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.220 [2024-11-29 09:30:57.968070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:35.220 [2024-11-29 09:30:57.968096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.220 #25 NEW cov: 11801 ft: 13259 corp: 8/101b lim: 25 exec/s: 0 rss: 68Mb L: 12/19 MS: 2 InsertRepeatedBytes-InsertRepeatedBytes- 00:07:35.220 [2024-11-29 09:30:58.018499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:35.220 [2024-11-29 09:30:58.018531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.220 [2024-11-29 09:30:58.018657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:35.220 [2024-11-29 09:30:58.018678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.220 #26 NEW cov: 11801 ft: 13294 corp: 9/113b lim: 25 exec/s: 0 rss: 68Mb L: 12/19 MS: 1 ChangeBit- 00:07:35.480 [2024-11-29 09:30:58.068834] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:35.480 [2024-11-29 09:30:58.068869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.480 [2024-11-29 09:30:58.068957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:35.481 [2024-11-29 09:30:58.068980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.481 [2024-11-29 09:30:58.069115] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:35.481 [2024-11-29 09:30:58.069138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.481 #27 NEW cov: 11801 ft: 13353 corp: 10/132b lim: 25 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 ShuffleBytes- 00:07:35.481 [2024-11-29 09:30:58.119017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:35.481 [2024-11-29 09:30:58.119052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.481 [2024-11-29 09:30:58.119123] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:35.481 [2024-11-29 09:30:58.119147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.481 [2024-11-29 09:30:58.119280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:35.481 [2024-11-29 09:30:58.119304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.481 #28 NEW cov: 11801 ft: 13399 corp: 11/151b lim: 25 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 ChangeByte- 00:07:35.481 [2024-11-29 09:30:58.168900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:35.481 [2024-11-29 09:30:58.168938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.481 [2024-11-29 09:30:58.169060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:35.481 [2024-11-29 09:30:58.169082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.481 #29 NEW cov: 11801 ft: 13438 corp: 12/163b lim: 25 exec/s: 0 rss: 68Mb L: 12/19 MS: 1 EraseBytes- 00:07:35.481 [2024-11-29 09:30:58.208388] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:35.481 [2024-11-29 09:30:58.208414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.481 #30 NEW cov: 11801 ft: 13837 corp: 13/169b lim: 25 exec/s: 0 rss: 69Mb L: 6/19 MS: 1 InsertRepeatedBytes- 00:07:35.481 [2024-11-29 
09:30:58.259306] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:35.481 [2024-11-29 09:30:58.259340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.481 [2024-11-29 09:30:58.259475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:35.481 [2024-11-29 09:30:58.259502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.481 [2024-11-29 09:30:58.259634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:35.481 [2024-11-29 09:30:58.259657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.481 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:35.481 #31 NEW cov: 11824 ft: 13944 corp: 14/188b lim: 25 exec/s: 0 rss: 69Mb L: 19/19 MS: 1 ChangeByte- 00:07:35.481 [2024-11-29 09:30:58.319603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:35.481 [2024-11-29 09:30:58.319633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.481 [2024-11-29 09:30:58.319718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:35.481 [2024-11-29 09:30:58.319741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.481 [2024-11-29 09:30:58.319860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:35.481 [2024-11-29 09:30:58.319883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.741 #32 NEW cov: 11824 ft: 13951 corp: 15/207b lim: 25 exec/s: 0 rss: 69Mb L: 19/19 MS: 1 ShuffleBytes- 00:07:35.741 [2024-11-29 09:30:58.358850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:35.741 [2024-11-29 09:30:58.358876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.741 #33 NEW cov: 11824 ft: 13994 corp: 16/213b lim: 25 exec/s: 33 rss: 69Mb L: 6/19 MS: 1 ChangeBit- 00:07:35.741 [2024-11-29 09:30:58.409628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:35.741 [2024-11-29 09:30:58.409655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.741 [2024-11-29 09:30:58.409803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:35.741 [2024-11-29 09:30:58.409828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.741 #34 NEW cov: 11824 ft: 14047 corp: 17/223b lim: 25 exec/s: 34 rss: 69Mb L: 10/19 MS: 1 CopyPart- 00:07:35.741 [2024-11-29 09:30:58.459594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) 
sqid:1 cid:0 nsid:0 00:07:35.741 [2024-11-29 09:30:58.459628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.741 #35 NEW cov: 11824 ft: 14056 corp: 18/229b lim: 25 exec/s: 35 rss: 69Mb L: 6/19 MS: 1 ChangeBit- 00:07:35.741 [2024-11-29 09:30:58.510542] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:35.741 [2024-11-29 09:30:58.510576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.741 [2024-11-29 09:30:58.510675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:35.741 [2024-11-29 09:30:58.510713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:35.741 [2024-11-29 09:30:58.510835] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:35.741 [2024-11-29 09:30:58.510860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:35.741 [2024-11-29 09:30:58.510990] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:35.741 [2024-11-29 09:30:58.511013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:35.741 [2024-11-29 09:30:58.511144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:07:35.741 [2024-11-29 09:30:58.511169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:35.741 #36 NEW cov: 11824 ft: 14538 corp: 19/254b lim: 25 exec/s: 36 rss: 69Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:07:35.741 [2024-11-29 09:30:58.560163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:35.741 [2024-11-29 09:30:58.560190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:35.741 [2024-11-29 09:30:58.560328] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:35.741 [2024-11-29 09:30:58.560347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.001 #37 NEW cov: 11824 ft: 14582 corp: 20/266b lim: 25 exec/s: 37 rss: 69Mb L: 12/25 MS: 1 ChangeBit- 00:07:36.001 [2024-11-29 09:30:58.610566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.001 [2024-11-29 09:30:58.610604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.001 [2024-11-29 09:30:58.610646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:36.001 [2024-11-29 09:30:58.610667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.001 [2024-11-29 09:30:58.610790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 
nsid:0 00:07:36.001 [2024-11-29 09:30:58.610813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.001 #38 NEW cov: 11824 ft: 14649 corp: 21/285b lim: 25 exec/s: 38 rss: 69Mb L: 19/25 MS: 1 ShuffleBytes- 00:07:36.001 [2024-11-29 09:30:58.661080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.001 [2024-11-29 09:30:58.661111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.001 [2024-11-29 09:30:58.661190] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:36.001 [2024-11-29 09:30:58.661208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.001 [2024-11-29 09:30:58.661332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:36.001 [2024-11-29 09:30:58.661353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.001 [2024-11-29 09:30:58.661479] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:36.001 [2024-11-29 09:30:58.661501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.001 [2024-11-29 09:30:58.661621] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:07:36.001 [2024-11-29 09:30:58.661661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:36.001 #39 NEW cov: 11824 ft: 14661 corp: 22/310b lim: 25 exec/s: 39 rss: 69Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:07:36.001 [2024-11-29 09:30:58.710789] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.001 [2024-11-29 09:30:58.710822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.001 [2024-11-29 09:30:58.710946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:36.001 [2024-11-29 09:30:58.710967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.001 #40 NEW cov: 11824 ft: 14691 corp: 23/323b lim: 25 exec/s: 40 rss: 69Mb L: 13/25 MS: 1 ChangeByte- 00:07:36.001 [2024-11-29 09:30:58.761129] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.001 [2024-11-29 09:30:58.761161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.001 [2024-11-29 09:30:58.761243] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:36.001 [2024-11-29 09:30:58.761263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.001 [2024-11-29 09:30:58.761399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 
00:07:36.001 [2024-11-29 09:30:58.761421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.001 #41 NEW cov: 11824 ft: 14706 corp: 24/342b lim: 25 exec/s: 41 rss: 69Mb L: 19/25 MS: 1 CMP- DE: "\001\223\317\344\251G\262j"- 00:07:36.001 [2024-11-29 09:30:58.800912] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.001 [2024-11-29 09:30:58.800945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.001 [2024-11-29 09:30:58.801065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:36.001 [2024-11-29 09:30:58.801086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.001 [2024-11-29 09:30:58.801213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:36.001 [2024-11-29 09:30:58.801237] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.001 #42 NEW cov: 11824 ft: 14758 corp: 25/361b lim: 25 exec/s: 42 rss: 69Mb L: 19/25 MS: 1 ShuffleBytes- 00:07:36.001 [2024-11-29 09:30:58.840847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.001 [2024-11-29 09:30:58.840877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.261 #43 NEW cov: 11824 ft: 14777 corp: 26/369b lim: 25 exec/s: 43 rss: 69Mb L: 8/25 MS: 1 EraseBytes- 00:07:36.261 [2024-11-29 09:30:58.880536] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.261 [2024-11-29 09:30:58.880572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.261 #44 NEW cov: 11824 ft: 14814 corp: 27/377b lim: 25 exec/s: 44 rss: 69Mb L: 8/25 MS: 1 ChangeBinInt- 00:07:36.261 [2024-11-29 09:30:58.921375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.261 [2024-11-29 09:30:58.921407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.261 [2024-11-29 09:30:58.921518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:36.261 [2024-11-29 09:30:58.921542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.261 [2024-11-29 09:30:58.921677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:36.261 [2024-11-29 09:30:58.921699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.261 #45 NEW cov: 11824 ft: 14919 corp: 28/396b lim: 25 exec/s: 45 rss: 69Mb L: 19/25 MS: 1 ChangeBit- 00:07:36.261 [2024-11-29 09:30:58.961805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.261 [2024-11-29 09:30:58.961835] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.261 [2024-11-29 09:30:58.961933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:36.261 [2024-11-29 09:30:58.961954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.261 [2024-11-29 09:30:58.962077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:36.261 [2024-11-29 09:30:58.962102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.261 [2024-11-29 09:30:58.962225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:36.261 [2024-11-29 09:30:58.962246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.261 #46 NEW cov: 11824 ft: 14956 corp: 29/416b lim: 25 exec/s: 46 rss: 69Mb L: 20/25 MS: 1 CrossOver- 00:07:36.261 [2024-11-29 09:30:59.001343] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.261 [2024-11-29 09:30:59.001368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.261 [2024-11-29 09:30:59.001501] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:36.261 [2024-11-29 09:30:59.001524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.261 #47 NEW cov: 11824 ft: 14984 corp: 30/429b lim: 25 exec/s: 47 rss: 69Mb L: 13/25 MS: 1 InsertByte- 00:07:36.261 [2024-11-29 09:30:59.041402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.261 [2024-11-29 09:30:59.041436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.261 [2024-11-29 09:30:59.041562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:36.261 [2024-11-29 09:30:59.041586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.261 [2024-11-29 09:30:59.041711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:36.261 [2024-11-29 09:30:59.041736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.261 #48 NEW cov: 11824 ft: 15015 corp: 31/448b lim: 25 exec/s: 48 rss: 69Mb L: 19/25 MS: 1 CMP- DE: "\000\000\377\377"- 00:07:36.261 [2024-11-29 09:30:59.091799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.261 [2024-11-29 09:30:59.091826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.261 [2024-11-29 09:30:59.091967] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:36.261 [2024-11-29 09:30:59.091992] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.521 #49 NEW cov: 11824 ft: 15036 corp: 32/460b lim: 25 exec/s: 49 rss: 69Mb L: 12/25 MS: 1 ChangeByte- 00:07:36.521 [2024-11-29 09:30:59.152091] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.521 [2024-11-29 09:30:59.152126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.521 [2024-11-29 09:30:59.152222] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:36.521 [2024-11-29 09:30:59.152246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.521 #50 NEW cov: 11824 ft: 15045 corp: 33/470b lim: 25 exec/s: 50 rss: 70Mb L: 10/25 MS: 1 PersAutoDict- DE: "\000\000\377\377"- 00:07:36.521 [2024-11-29 09:30:59.202132] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.521 [2024-11-29 09:30:59.202164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.521 [2024-11-29 09:30:59.202294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:36.521 [2024-11-29 09:30:59.202318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.521 #51 NEW cov: 11824 ft: 15060 corp: 34/480b lim: 25 exec/s: 51 rss: 70Mb L: 10/25 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:36.521 [2024-11-29 09:30:59.252539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.521 [2024-11-29 09:30:59.252572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.521 [2024-11-29 09:30:59.252705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:36.521 [2024-11-29 09:30:59.252729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.521 [2024-11-29 09:30:59.252857] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:36.521 [2024-11-29 09:30:59.252878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.521 #52 NEW cov: 11824 ft: 15073 corp: 35/499b lim: 25 exec/s: 52 rss: 70Mb L: 19/25 MS: 1 ChangeBit- 00:07:36.521 [2024-11-29 09:30:59.312675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.521 [2024-11-29 09:30:59.312706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.521 [2024-11-29 09:30:59.312851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:36.521 [2024-11-29 09:30:59.312876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.521 [2024-11-29 09:30:59.313011] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:36.521 [2024-11-29 09:30:59.313035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.521 #53 NEW cov: 11824 ft: 15074 corp: 36/518b lim: 25 exec/s: 53 rss: 70Mb L: 19/25 MS: 1 ChangeBinInt- 00:07:36.522 [2024-11-29 09:30:59.353164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:07:36.522 [2024-11-29 09:30:59.353195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:36.522 [2024-11-29 09:30:59.353293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:07:36.522 [2024-11-29 09:30:59.353316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:36.522 [2024-11-29 09:30:59.353444] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:07:36.522 [2024-11-29 09:30:59.353467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:36.522 [2024-11-29 09:30:59.353593] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:07:36.522 [2024-11-29 09:30:59.353618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:36.522 [2024-11-29 09:30:59.353749] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:07:36.522 [2024-11-29 09:30:59.353772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:07:36.781 #54 NEW cov: 11824 ft: 15083 corp: 37/543b lim: 25 exec/s: 27 rss: 70Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:07:36.781 #54 DONE cov: 11824 ft: 15083 corp: 37/543b lim: 25 exec/s: 27 rss: 70Mb 00:07:36.781 ###### Recommended dictionary. ###### 00:07:36.781 "\001\223\317\344\251G\262j" # Uses: 0 00:07:36.781 "\000\000\377\377" # Uses: 1 00:07:36.781 "\000\000\000\000" # Uses: 0 00:07:36.781 ###### End of recommended dictionary. 
###### 00:07:36.781 Done 54 runs in 2 second(s) 00:07:36.781 09:30:59 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:07:36.781 09:30:59 -- ../common.sh@72 -- # (( i++ )) 00:07:36.781 09:30:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:36.781 09:30:59 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:07:36.781 09:30:59 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:07:36.781 09:30:59 -- nvmf/run.sh@24 -- # local timen=1 00:07:36.781 09:30:59 -- nvmf/run.sh@25 -- # local core=0x1 00:07:36.781 09:30:59 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:07:36.781 09:30:59 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:07:36.781 09:30:59 -- nvmf/run.sh@29 -- # printf %02d 24 00:07:36.781 09:30:59 -- nvmf/run.sh@29 -- # port=4424 00:07:36.781 09:30:59 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:07:36.781 09:30:59 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:07:36.781 09:30:59 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:36.781 09:30:59 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:07:36.781 [2024-11-29 09:30:59.546981] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:36.781 [2024-11-29 09:30:59.547059] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3189813 ] 00:07:36.781 EAL: No free 2048 kB hugepages reported on node 1 00:07:37.040 [2024-11-29 09:30:59.734294] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.040 [2024-11-29 09:30:59.797497] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:37.040 [2024-11-29 09:30:59.797648] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.040 [2024-11-29 09:30:59.855388] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:37.040 [2024-11-29 09:30:59.871733] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:07:37.300 INFO: Running with entropic power schedule (0xFF, 100). 00:07:37.300 INFO: Seed: 2500967014 00:07:37.300 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:37.300 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:37.300 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:07:37.300 INFO: A corpus is not provided, starting from an empty corpus 00:07:37.300 #2 INITED exec/s: 0 rss: 60Mb 00:07:37.300 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
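The nvmf/run.sh trace above shows how each fuzzer run is parameterized from its number alone: run 24 gets corpus directory llvm_nvmf_24, config /tmp/fuzz_json_24.conf, and TCP port 4424, with sed rewriting the template's trsvcid before llvm_nvme_fuzz is launched. A minimal sketch of that pattern, reconstructed from the trace; the 44<NN> port convention, the SPDK_ROOT variable, and the redirect of sed's output are assumptions, since the xtrace does not show them directly:

  # Sketch of the per-run setup traced in nvmf/run.sh above (illustrative,
  # not the actual script; SPDK_ROOT and the 44<NN> port rule are assumed).
  fuzzer_type=24
  timen=1     # -t: seconds to fuzz
  core=0x1    # -m: reactor core mask
  corpus_dir="$SPDK_ROOT/../corpus/llvm_nvmf_${fuzzer_type}"
  nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"

  port="44$(printf %02d "$fuzzer_type")"   # 24 -> 4424
  mkdir -p "$corpus_dir"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

  # Point the JSON config at this run's port (the template listens on 4420).
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$SPDK_ROOT/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

  "$SPDK_ROOT/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
      -m "$core" -s 512 -P "$SPDK_ROOT/../output/llvm/" \
      -F "$trid" -c "$nvmf_cfg" -t "$timen" \
      -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"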
00:07:37.300 This may also happen if the target rejected all inputs we tried so far 00:07:37.300 [2024-11-29 09:30:59.947644] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.300 [2024-11-29 09:30:59.947682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.300 [2024-11-29 09:30:59.947811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.300 [2024-11-29 09:30:59.947832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.559 NEW_FUNC[1/672]: 0x466208 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:07:37.559 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:37.559 #44 NEW cov: 11670 ft: 11670 corp: 2/43b lim: 100 exec/s: 0 rss: 68Mb L: 42/42 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:37.559 [2024-11-29 09:31:00.268629] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.560 [2024-11-29 09:31:00.268668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.560 [2024-11-29 09:31:00.268786] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.560 [2024-11-29 09:31:00.268808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.560 #50 NEW cov: 11783 ft: 12102 corp: 3/101b lim: 100 exec/s: 0 rss: 68Mb L: 58/58 MS: 1 CopyPart- 00:07:37.560 [2024-11-29 09:31:00.318696] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.560 [2024-11-29 09:31:00.318730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.560 [2024-11-29 09:31:00.318847] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.560 [2024-11-29 09:31:00.318870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.560 #51 NEW cov: 11789 ft: 12460 corp: 4/159b lim: 100 exec/s: 0 rss: 68Mb L: 58/58 MS: 1 CrossOver- 00:07:37.560 [2024-11-29 09:31:00.358977] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13816973010982125503 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.560 [2024-11-29 09:31:00.359012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.560 [2024-11-29 09:31:00.359138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13816973012072644543 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.560 [2024-11-29 09:31:00.359158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.560 [2024-11-29 09:31:00.359281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13816973012072644543 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.560 [2024-11-29 09:31:00.359302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.560 #53 NEW cov: 11874 ft: 13160 corp: 5/219b lim: 100 exec/s: 0 rss: 68Mb L: 60/60 MS: 2 ChangeByte-InsertRepeatedBytes- 00:07:37.560 [2024-11-29 09:31:00.398877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.560 [2024-11-29 09:31:00.398910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.560 [2024-11-29 09:31:00.399030] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.560 [2024-11-29 09:31:00.399054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.820 #59 NEW cov: 11874 ft: 13232 corp: 6/261b lim: 100 exec/s: 0 rss: 68Mb L: 42/60 MS: 1 CMP- DE: "\000\000\000\001"- 00:07:37.820 [2024-11-29 09:31:00.439263] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.820 [2024-11-29 09:31:00.439297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.820 [2024-11-29 09:31:00.439409] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.820 [2024-11-29 09:31:00.439433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.820 [2024-11-29 09:31:00.439552] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.820 [2024-11-29 09:31:00.439575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.820 #60 NEW cov: 11874 ft: 13284 corp: 7/337b lim: 100 exec/s: 0 rss: 68Mb L: 76/76 MS: 1 InsertRepeatedBytes- 00:07:37.820 [2024-11-29 09:31:00.479114] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.820 [2024-11-29 09:31:00.479141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.820 [2024-11-29 09:31:00.479259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.820 [2024-11-29 09:31:00.479283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.820 #61 NEW cov: 11874 ft: 13412 corp: 8/380b lim: 100 exec/s: 0 rss: 68Mb L: 43/76 MS: 1 InsertByte- 00:07:37.820 [2024-11-29 09:31:00.519537] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 
lba:13816973010982125503 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.820 [2024-11-29 09:31:00.519570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.820 [2024-11-29 09:31:00.519689] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13816973012072644543 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.820 [2024-11-29 09:31:00.519710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.820 [2024-11-29 09:31:00.519833] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13816972196028858303 len:58801 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.820 [2024-11-29 09:31:00.519850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:37.820 #62 NEW cov: 11874 ft: 13441 corp: 9/440b lim: 100 exec/s: 0 rss: 68Mb L: 60/76 MS: 1 CMP- DE: "\001\223\317\345\260=\024f"- 00:07:37.820 [2024-11-29 09:31:00.559398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.820 [2024-11-29 09:31:00.559428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.820 [2024-11-29 09:31:00.559540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.820 [2024-11-29 09:31:00.559562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.820 #64 NEW cov: 11874 ft: 13539 corp: 10/486b lim: 100 exec/s: 0 rss: 68Mb L: 46/76 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:37.820 [2024-11-29 09:31:00.599521] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551406 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.820 [2024-11-29 09:31:00.599550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.820 [2024-11-29 09:31:00.599634] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.820 [2024-11-29 09:31:00.599653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.820 #65 NEW cov: 11874 ft: 13645 corp: 11/532b lim: 100 exec/s: 0 rss: 68Mb L: 46/76 MS: 1 ChangeBinInt- 00:07:37.820 [2024-11-29 09:31:00.639588] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.820 [2024-11-29 09:31:00.639624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:37.820 [2024-11-29 09:31:00.639744] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.820 [2024-11-29 09:31:00.639766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:37.820 #66 NEW cov: 11874 ft: 13678 corp: 12/575b lim: 100 exec/s: 0 rss: 68Mb L: 43/76 MS: 1 InsertByte- 00:07:38.080 [2024-11-29 09:31:00.679706] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.080 [2024-11-29 09:31:00.679740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.080 [2024-11-29 09:31:00.679844] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:14980573511557317011 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.080 [2024-11-29 09:31:00.679865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.080 #67 NEW cov: 11874 ft: 13778 corp: 13/618b lim: 100 exec/s: 0 rss: 69Mb L: 43/76 MS: 1 PersAutoDict- DE: "\001\223\317\345\260=\024f"- 00:07:38.080 [2024-11-29 09:31:00.720237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13816973010982125503 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.080 [2024-11-29 09:31:00.720270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.081 [2024-11-29 09:31:00.720384] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13816973012072644543 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.081 [2024-11-29 09:31:00.720407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.081 [2024-11-29 09:31:00.720528] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13816973012072644543 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.081 [2024-11-29 09:31:00.720551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.081 #68 NEW cov: 11874 ft: 13789 corp: 14/678b lim: 100 exec/s: 0 rss: 69Mb L: 60/76 MS: 1 ChangeByte- 00:07:38.081 [2024-11-29 09:31:00.760310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13816973010982125503 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.081 [2024-11-29 09:31:00.760339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.081 [2024-11-29 09:31:00.760455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13816973012072644543 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.081 [2024-11-29 09:31:00.760474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.081 [2024-11-29 09:31:00.760591] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13816973012072644543 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.081 [2024-11-29 09:31:00.760619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.081 #74 NEW cov: 11874 ft: 13847 corp: 15/739b lim: 100 exec/s: 0 rss: 69Mb L: 61/76 MS: 1 InsertByte- 00:07:38.081 [2024-11-29 09:31:00.810085] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.081 [2024-11-29 09:31:00.810120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.081 [2024-11-29 09:31:00.810252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.081 [2024-11-29 09:31:00.810275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.081 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:38.081 #75 NEW cov: 11897 ft: 13930 corp: 16/781b lim: 100 exec/s: 0 rss: 69Mb L: 42/76 MS: 1 ChangeByte- 00:07:38.081 [2024-11-29 09:31:00.850227] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.081 [2024-11-29 09:31:00.850253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.081 [2024-11-29 09:31:00.850379] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.081 [2024-11-29 09:31:00.850404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.081 #76 NEW cov: 11897 ft: 13962 corp: 17/823b lim: 100 exec/s: 0 rss: 69Mb L: 42/76 MS: 1 PersAutoDict- DE: "\001\223\317\345\260=\024f"- 00:07:38.081 [2024-11-29 09:31:00.900478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.081 [2024-11-29 09:31:00.900509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.081 [2024-11-29 09:31:00.900639] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:14980573511557317011 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.081 [2024-11-29 09:31:00.900663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.340 #77 NEW cov: 11897 ft: 13981 corp: 18/866b lim: 100 exec/s: 77 rss: 69Mb L: 43/76 MS: 1 CMP- DE: "\377\377\001\000"- 00:07:38.340 [2024-11-29 09:31:00.950603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.340 [2024-11-29 09:31:00.950636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.340 [2024-11-29 09:31:00.950770] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.340 [2024-11-29 09:31:00.950792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.340 #78 NEW cov: 11897 ft: 13995 corp: 19/909b lim: 100 exec/s: 78 rss: 69Mb L: 43/76 MS: 1 PersAutoDict- DE: "\000\000\000\001"- 00:07:38.340 [2024-11-29 09:31:00.990439] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.340 [2024-11-29 09:31:00.990466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.340 #79 NEW cov: 11897 ft: 14815 corp: 20/942b lim: 100 exec/s: 79 rss: 69Mb L: 33/76 MS: 1 EraseBytes- 00:07:38.340 [2024-11-29 09:31:01.040824] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.340 [2024-11-29 09:31:01.040854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.340 [2024-11-29 09:31:01.040978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.340 [2024-11-29 09:31:01.040999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.340 #80 NEW cov: 11897 ft: 14849 corp: 21/985b lim: 100 exec/s: 80 rss: 69Mb L: 43/76 MS: 1 CopyPart- 00:07:38.341 [2024-11-29 09:31:01.091216] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.341 [2024-11-29 09:31:01.091252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.341 [2024-11-29 09:31:01.091367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18387915803577024511 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.341 [2024-11-29 09:31:01.091388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.341 [2024-11-29 09:31:01.091506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.341 [2024-11-29 09:31:01.091531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.341 #81 NEW cov: 11897 ft: 14902 corp: 22/1055b lim: 100 exec/s: 81 rss: 69Mb L: 70/76 MS: 1 CrossOver- 00:07:38.341 [2024-11-29 09:31:01.141414] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:361700864190383365 len:1286 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.341 [2024-11-29 09:31:01.141452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.341 [2024-11-29 09:31:01.141558] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:361700864190383365 len:1286 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.341 [2024-11-29 09:31:01.141580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.341 [2024-11-29 09:31:01.141696] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:361700864190383365 len:1286 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.341 [2024-11-29 09:31:01.141716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:07:38.341 #83 NEW cov: 11897 ft: 14906 corp: 23/1117b lim: 100 exec/s: 83 rss: 69Mb L: 62/76 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:38.341 [2024-11-29 09:31:01.181519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13816973010982125503 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.341 [2024-11-29 09:31:01.181549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.341 [2024-11-29 09:31:01.181656] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13816973012072644543 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.341 [2024-11-29 09:31:01.181680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.341 [2024-11-29 09:31:01.181796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13816972196028858303 len:58801 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.341 [2024-11-29 09:31:01.181817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.600 #84 NEW cov: 11897 ft: 14934 corp: 24/1177b lim: 100 exec/s: 84 rss: 69Mb L: 60/76 MS: 1 ChangeBit- 00:07:38.600 [2024-11-29 09:31:01.231345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.600 [2024-11-29 09:31:01.231377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.600 [2024-11-29 09:31:01.231488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:281474976710656 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.600 [2024-11-29 09:31:01.231517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.600 #85 NEW cov: 11897 ft: 14942 corp: 25/1220b lim: 100 exec/s: 85 rss: 69Mb L: 43/76 MS: 1 CopyPart- 00:07:38.600 [2024-11-29 09:31:01.271665] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13816973010982125503 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.600 [2024-11-29 09:31:01.271694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.600 [2024-11-29 09:31:01.271802] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13816973012072623295 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.600 [2024-11-29 09:31:01.271824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.600 [2024-11-29 09:31:01.271938] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13816973012072644543 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.600 [2024-11-29 09:31:01.271958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.601 #86 NEW cov: 11897 ft: 14964 corp: 26/1280b lim: 100 exec/s: 86 rss: 69Mb L: 60/76 MS: 1 ChangeByte- 00:07:38.601 [2024-11-29 09:31:01.311268] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.601 [2024-11-29 09:31:01.311302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.601 #87 NEW cov: 11897 ft: 14975 corp: 27/1310b lim: 100 exec/s: 87 rss: 69Mb L: 30/76 MS: 1 CrossOver- 00:07:38.601 [2024-11-29 09:31:01.351955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:361700864190383365 len:1286 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.601 [2024-11-29 09:31:01.351990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.601 [2024-11-29 09:31:01.352107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:361700864190383365 len:1286 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.601 [2024-11-29 09:31:01.352131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.601 [2024-11-29 09:31:01.352246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:361700864190383423 len:1286 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.601 [2024-11-29 09:31:01.352268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.601 #88 NEW cov: 11897 ft: 14987 corp: 28/1373b lim: 100 exec/s: 88 rss: 70Mb L: 63/76 MS: 1 InsertByte- 00:07:38.601 [2024-11-29 09:31:01.401863] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.601 [2024-11-29 09:31:01.401896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.601 [2024-11-29 09:31:01.401992] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.601 [2024-11-29 09:31:01.402013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.601 #89 NEW cov: 11897 ft: 15009 corp: 29/1415b lim: 100 exec/s: 89 rss: 70Mb L: 42/76 MS: 1 ChangeByte- 00:07:38.601 [2024-11-29 09:31:01.442000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.601 [2024-11-29 09:31:01.442039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.601 [2024-11-29 09:31:01.442124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.601 [2024-11-29 09:31:01.442146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.861 #90 NEW cov: 11897 ft: 15016 corp: 30/1457b lim: 100 exec/s: 90 rss: 70Mb L: 42/76 MS: 1 EraseBytes- 00:07:38.861 [2024-11-29 09:31:01.482105] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2617245696 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.861 [2024-11-29 09:31:01.482140] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.861 [2024-11-29 09:31:01.482254] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.861 [2024-11-29 09:31:01.482277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.861 #91 NEW cov: 11897 ft: 15032 corp: 31/1500b lim: 100 exec/s: 91 rss: 70Mb L: 43/76 MS: 1 InsertByte- 00:07:38.861 [2024-11-29 09:31:01.522233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16550795786695316431 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.861 [2024-11-29 09:31:01.522267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.861 [2024-11-29 09:31:01.522385] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.861 [2024-11-29 09:31:01.522406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.861 #92 NEW cov: 11897 ft: 15043 corp: 32/1542b lim: 100 exec/s: 92 rss: 70Mb L: 42/76 MS: 1 PersAutoDict- DE: "\001\223\317\345\260=\024f"- 00:07:38.861 [2024-11-29 09:31:01.562334] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18302628885633695744 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.861 [2024-11-29 09:31:01.562360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.861 [2024-11-29 09:31:01.562475] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.861 [2024-11-29 09:31:01.562495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.861 #93 NEW cov: 11897 ft: 15051 corp: 33/1584b lim: 100 exec/s: 93 rss: 70Mb L: 42/76 MS: 1 ChangeBinInt- 00:07:38.861 [2024-11-29 09:31:01.602585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:2617245696 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.861 [2024-11-29 09:31:01.602622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.861 [2024-11-29 09:31:01.602711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.861 [2024-11-29 09:31:01.602731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.861 [2024-11-29 09:31:01.602855] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:59 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.861 [2024-11-29 09:31:01.602874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.861 #94 NEW cov: 11897 ft: 15058 corp: 34/1649b lim: 100 exec/s: 94 rss: 70Mb L: 65/76 MS: 1 InsertRepeatedBytes- 00:07:38.861 [2024-11-29 09:31:01.643099] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.861 [2024-11-29 09:31:01.643130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.862 [2024-11-29 09:31:01.643226] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.862 [2024-11-29 09:31:01.643249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:38.862 [2024-11-29 09:31:01.643366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.862 [2024-11-29 09:31:01.643390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:38.862 [2024-11-29 09:31:01.643501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.862 [2024-11-29 09:31:01.643528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:07:38.862 #95 NEW cov: 11897 ft: 15392 corp: 35/1734b lim: 100 exec/s: 95 rss: 70Mb L: 85/85 MS: 1 InsertRepeatedBytes- 00:07:38.862 [2024-11-29 09:31:01.692679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.862 [2024-11-29 09:31:01.692710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:38.862 [2024-11-29 09:31:01.692824] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.862 [2024-11-29 09:31:01.692848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:39.121 #96 NEW cov: 11897 ft: 15426 corp: 36/1777b lim: 100 exec/s: 96 rss: 70Mb L: 43/85 MS: 1 CrossOver- 00:07:39.121 [2024-11-29 09:31:01.732825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:361700864106168325 len:1286 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.121 [2024-11-29 09:31:01.732857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.121 [2024-11-29 09:31:01.732979] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:361700864190383365 len:1286 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.121 [2024-11-29 09:31:01.733000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:39.121 #97 NEW cov: 11897 ft: 15438 corp: 37/1817b lim: 100 exec/s: 97 rss: 70Mb L: 40/85 MS: 1 CrossOver- 00:07:39.121 [2024-11-29 09:31:01.772623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.121 [2024-11-29 09:31:01.772656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.121 #98 NEW 
cov: 11897 ft: 15471 corp: 38/1850b lim: 100 exec/s: 98 rss: 70Mb L: 33/85 MS: 1 ChangeBinInt- 00:07:39.121 [2024-11-29 09:31:01.813346] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13816973010982125503 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.121 [2024-11-29 09:31:01.813376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.121 [2024-11-29 09:31:01.813472] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13816973012072623295 len:65282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.121 [2024-11-29 09:31:01.813496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:39.121 [2024-11-29 09:31:01.813605] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13816973012072644543 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.121 [2024-11-29 09:31:01.813628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:39.121 #99 NEW cov: 11897 ft: 15503 corp: 39/1914b lim: 100 exec/s: 99 rss: 70Mb L: 64/85 MS: 1 PersAutoDict- DE: "\377\377\001\000"- 00:07:39.121 [2024-11-29 09:31:01.853518] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13816973010982125503 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.121 [2024-11-29 09:31:01.853548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.121 [2024-11-29 09:31:01.853658] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13816973012072644543 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.121 [2024-11-29 09:31:01.853679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:39.121 [2024-11-29 09:31:01.853793] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:7403847143333575956 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.121 [2024-11-29 09:31:01.853815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:39.121 #100 NEW cov: 11897 ft: 15571 corp: 40/1975b lim: 100 exec/s: 100 rss: 70Mb L: 61/85 MS: 1 PersAutoDict- DE: "\001\223\317\345\260=\024f"- 00:07:39.121 [2024-11-29 09:31:01.893613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13816972465521278911 len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.121 [2024-11-29 09:31:01.893645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:07:39.121 [2024-11-29 09:31:01.893703] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13816973012072623295 len:65282 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.121 [2024-11-29 09:31:01.893721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:07:39.121 [2024-11-29 09:31:01.893836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13816973012072644543 
len:49088 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.121 [2024-11-29 09:31:01.893856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:07:39.121 #101 NEW cov: 11897 ft: 15627 corp: 41/2039b lim: 100 exec/s: 50 rss: 70Mb L: 64/85 MS: 1 ChangeBinInt- 00:07:39.121 #101 DONE cov: 11897 ft: 15627 corp: 41/2039b lim: 100 exec/s: 50 rss: 70Mb 00:07:39.121 ###### Recommended dictionary. ###### 00:07:39.121 "\000\000\000\001" # Uses: 1 00:07:39.121 "\001\223\317\345\260=\024f" # Uses: 5 00:07:39.121 "\377\377\001\000" # Uses: 1 00:07:39.121 ###### End of recommended dictionary. ###### 00:07:39.121 Done 101 runs in 2 second(s) 00:07:39.380 09:31:02 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:07:39.380 09:31:02 -- ../common.sh@72 -- # (( i++ )) 00:07:39.380 09:31:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:39.380 09:31:02 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:07:39.380 00:07:39.380 real 1m5.300s 00:07:39.380 user 1m40.771s 00:07:39.380 sys 0m8.089s 00:07:39.380 09:31:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:39.380 09:31:02 -- common/autotest_common.sh@10 -- # set +x 00:07:39.380 ************************************ 00:07:39.380 END TEST nvmf_fuzz 00:07:39.380 ************************************ 00:07:39.380 09:31:02 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:39.380 09:31:02 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:39.380 09:31:02 -- fuzz/llvm.sh@20 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:07:39.380 09:31:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:39.380 09:31:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:39.380 09:31:02 -- common/autotest_common.sh@10 -- # set +x 00:07:39.380 ************************************ 00:07:39.380 START TEST vfio_fuzz 00:07:39.380 ************************************ 00:07:39.380 09:31:02 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:07:39.380 * Looking for test storage... 
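The timing block above (real 1m5.300s / user 1m40.771s / sys 0m8.089s) and the START TEST / END TEST banners are emitted by the harness's run_test wrapper in common/autotest_common.sh, which the trace shows being invoked as run_test vfio_fuzz .../vfio/run.sh. A rough sketch of such a wrapper, assuming only what the banners and the time output reveal; the real function does more argument checking and bookkeeping:

  # Illustrative run_test-style wrapper (not the actual autotest_common.sh
  # implementation): print banners around the test command and time it.
  run_test() {
      local test_name=$1
      shift
      echo '************************************'
      echo "START TEST $test_name"
      echo '************************************'
      time "$@"            # produces the real/user/sys lines seen above
      local rc=$?
      echo '************************************'
      echo "END TEST $test_name"
      echo '************************************'
      return "$rc"
  }

  # As in the log: run_test vfio_fuzz "$SPDK_ROOT/test/fuzz/llvm/vfio/run.sh"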
00:07:39.380 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:39.380 09:31:02 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:39.380 09:31:02 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:39.380 09:31:02 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:39.642 09:31:02 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:39.642 09:31:02 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:39.642 09:31:02 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:39.642 09:31:02 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:39.642 09:31:02 -- scripts/common.sh@335 -- # IFS=.-: 00:07:39.642 09:31:02 -- scripts/common.sh@335 -- # read -ra ver1 00:07:39.642 09:31:02 -- scripts/common.sh@336 -- # IFS=.-: 00:07:39.642 09:31:02 -- scripts/common.sh@336 -- # read -ra ver2 00:07:39.642 09:31:02 -- scripts/common.sh@337 -- # local 'op=<' 00:07:39.642 09:31:02 -- scripts/common.sh@339 -- # ver1_l=2 00:07:39.642 09:31:02 -- scripts/common.sh@340 -- # ver2_l=1 00:07:39.642 09:31:02 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:39.642 09:31:02 -- scripts/common.sh@343 -- # case "$op" in 00:07:39.642 09:31:02 -- scripts/common.sh@344 -- # : 1 00:07:39.642 09:31:02 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:39.642 09:31:02 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:39.642 09:31:02 -- scripts/common.sh@364 -- # decimal 1 00:07:39.642 09:31:02 -- scripts/common.sh@352 -- # local d=1 00:07:39.642 09:31:02 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:39.642 09:31:02 -- scripts/common.sh@354 -- # echo 1 00:07:39.642 09:31:02 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:39.642 09:31:02 -- scripts/common.sh@365 -- # decimal 2 00:07:39.642 09:31:02 -- scripts/common.sh@352 -- # local d=2 00:07:39.642 09:31:02 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:39.642 09:31:02 -- scripts/common.sh@354 -- # echo 2 00:07:39.642 09:31:02 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:39.642 09:31:02 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:39.642 09:31:02 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:39.642 09:31:02 -- scripts/common.sh@367 -- # return 0 00:07:39.642 09:31:02 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:39.642 09:31:02 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:39.642 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.642 --rc genhtml_branch_coverage=1 00:07:39.642 --rc genhtml_function_coverage=1 00:07:39.642 --rc genhtml_legend=1 00:07:39.642 --rc geninfo_all_blocks=1 00:07:39.642 --rc geninfo_unexecuted_blocks=1 00:07:39.642 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.642 ' 00:07:39.642 09:31:02 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:39.642 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.642 --rc genhtml_branch_coverage=1 00:07:39.642 --rc genhtml_function_coverage=1 00:07:39.642 --rc genhtml_legend=1 00:07:39.642 --rc geninfo_all_blocks=1 00:07:39.642 --rc geninfo_unexecuted_blocks=1 00:07:39.642 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.642 ' 00:07:39.642 09:31:02 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:39.642 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.642 --rc genhtml_branch_coverage=1 
00:07:39.642 --rc genhtml_function_coverage=1 00:07:39.642 --rc genhtml_legend=1 00:07:39.642 --rc geninfo_all_blocks=1 00:07:39.642 --rc geninfo_unexecuted_blocks=1 00:07:39.642 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.642 ' 00:07:39.642 09:31:02 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:39.642 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.642 --rc genhtml_branch_coverage=1 00:07:39.642 --rc genhtml_function_coverage=1 00:07:39.642 --rc genhtml_legend=1 00:07:39.642 --rc geninfo_all_blocks=1 00:07:39.642 --rc geninfo_unexecuted_blocks=1 00:07:39.642 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.642 ' 00:07:39.642 09:31:02 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:39.642 09:31:02 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:39.642 09:31:02 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:39.642 09:31:02 -- common/autotest_common.sh@34 -- # set -e 00:07:39.642 09:31:02 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:39.642 09:31:02 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:39.642 09:31:02 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:39.642 09:31:02 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:39.642 09:31:02 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:39.642 09:31:02 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:39.642 09:31:02 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:39.642 09:31:02 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:39.642 09:31:02 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:39.642 09:31:02 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:39.642 09:31:02 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:39.642 09:31:02 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:39.642 09:31:02 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:39.642 09:31:02 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:39.642 09:31:02 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:39.642 09:31:02 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:39.642 09:31:02 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:39.642 09:31:02 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:39.642 09:31:02 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:39.642 09:31:02 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:39.642 09:31:02 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:39.642 09:31:02 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:39.642 09:31:02 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:39.642 09:31:02 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:39.642 09:31:02 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:39.642 09:31:02 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:39.642 09:31:02 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:39.642 09:31:02 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:39.642 09:31:02 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:39.642 
09:31:02 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:39.642 09:31:02 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:39.642 09:31:02 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:39.642 09:31:02 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:39.642 09:31:02 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:39.642 09:31:02 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:39.642 09:31:02 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:39.642 09:31:02 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:39.642 09:31:02 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:39.642 09:31:02 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:39.642 09:31:02 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:39.642 09:31:02 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:39.642 09:31:02 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:39.642 09:31:02 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:39.642 09:31:02 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:39.642 09:31:02 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:07:39.642 09:31:02 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:39.642 09:31:02 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:39.642 09:31:02 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:39.642 09:31:02 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:39.642 09:31:02 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:39.642 09:31:02 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:39.642 09:31:02 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:39.642 09:31:02 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:39.642 09:31:02 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:39.642 09:31:02 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:39.642 09:31:02 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:39.642 09:31:02 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:39.642 09:31:02 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:39.642 09:31:02 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:39.642 09:31:02 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:39.642 09:31:02 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:39.642 09:31:02 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:39.642 09:31:02 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:39.642 09:31:02 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:39.642 09:31:02 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:07:39.642 09:31:02 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:39.642 09:31:02 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:39.642 09:31:02 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:39.642 09:31:02 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:39.642 09:31:02 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:39.642 09:31:02 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:39.642 09:31:02 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:39.642 09:31:02 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:39.642 09:31:02 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:39.642 09:31:02 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 
00:07:39.643 09:31:02 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:07:39.643 09:31:02 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:07:39.643 09:31:02 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:07:39.643 09:31:02 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:07:39.643 09:31:02 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:07:39.643 09:31:02 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:39.643 09:31:02 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:07:39.643 09:31:02 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:07:39.643 09:31:02 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:39.643 09:31:02 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:39.643 09:31:02 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:39.643 09:31:02 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:39.643 09:31:02 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:39.643 09:31:02 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:39.643 09:31:02 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:39.643 09:31:02 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:39.643 09:31:02 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:39.643 09:31:02 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:39.643 09:31:02 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:39.643 09:31:02 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:39.643 09:31:02 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:39.643 09:31:02 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:39.643 09:31:02 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:39.643 09:31:02 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:39.643 #define SPDK_CONFIG_H 00:07:39.643 #define SPDK_CONFIG_APPS 1 00:07:39.643 #define SPDK_CONFIG_ARCH native 00:07:39.643 #undef SPDK_CONFIG_ASAN 00:07:39.643 #undef SPDK_CONFIG_AVAHI 00:07:39.643 #undef SPDK_CONFIG_CET 00:07:39.643 #define SPDK_CONFIG_COVERAGE 1 00:07:39.643 #define SPDK_CONFIG_CROSS_PREFIX 00:07:39.643 #undef SPDK_CONFIG_CRYPTO 00:07:39.643 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:39.643 #undef SPDK_CONFIG_CUSTOMOCF 00:07:39.643 #undef SPDK_CONFIG_DAOS 00:07:39.643 #define SPDK_CONFIG_DAOS_DIR 00:07:39.643 #define SPDK_CONFIG_DEBUG 1 00:07:39.643 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:39.643 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:39.643 #define SPDK_CONFIG_DPDK_INC_DIR 00:07:39.643 #define SPDK_CONFIG_DPDK_LIB_DIR 00:07:39.643 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:39.643 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:39.643 #define SPDK_CONFIG_EXAMPLES 1 00:07:39.643 #undef SPDK_CONFIG_FC 00:07:39.643 #define SPDK_CONFIG_FC_PATH 00:07:39.643 #define SPDK_CONFIG_FIO_PLUGIN 1 
00:07:39.643 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:39.643 #undef SPDK_CONFIG_FUSE 00:07:39.643 #define SPDK_CONFIG_FUZZER 1 00:07:39.643 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:39.643 #undef SPDK_CONFIG_GOLANG 00:07:39.643 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:39.643 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:39.643 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:39.643 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:39.643 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:39.643 #define SPDK_CONFIG_IDXD 1 00:07:39.643 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:39.643 #undef SPDK_CONFIG_IPSEC_MB 00:07:39.643 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:39.643 #define SPDK_CONFIG_ISAL 1 00:07:39.643 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:39.643 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:39.643 #define SPDK_CONFIG_LIBDIR 00:07:39.643 #undef SPDK_CONFIG_LTO 00:07:39.643 #define SPDK_CONFIG_MAX_LCORES 00:07:39.643 #define SPDK_CONFIG_NVME_CUSE 1 00:07:39.643 #undef SPDK_CONFIG_OCF 00:07:39.643 #define SPDK_CONFIG_OCF_PATH 00:07:39.643 #define SPDK_CONFIG_OPENSSL_PATH 00:07:39.643 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:39.643 #undef SPDK_CONFIG_PGO_USE 00:07:39.643 #define SPDK_CONFIG_PREFIX /usr/local 00:07:39.643 #undef SPDK_CONFIG_RAID5F 00:07:39.643 #undef SPDK_CONFIG_RBD 00:07:39.643 #define SPDK_CONFIG_RDMA 1 00:07:39.643 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:39.643 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:39.643 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:39.643 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:39.643 #undef SPDK_CONFIG_SHARED 00:07:39.643 #undef SPDK_CONFIG_SMA 00:07:39.643 #define SPDK_CONFIG_TESTS 1 00:07:39.643 #undef SPDK_CONFIG_TSAN 00:07:39.643 #define SPDK_CONFIG_UBLK 1 00:07:39.643 #define SPDK_CONFIG_UBSAN 1 00:07:39.643 #undef SPDK_CONFIG_UNIT_TESTS 00:07:39.643 #undef SPDK_CONFIG_URING 00:07:39.643 #define SPDK_CONFIG_URING_PATH 00:07:39.643 #undef SPDK_CONFIG_URING_ZNS 00:07:39.643 #undef SPDK_CONFIG_USDT 00:07:39.643 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:39.643 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:39.643 #define SPDK_CONFIG_VFIO_USER 1 00:07:39.643 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:39.643 #define SPDK_CONFIG_VHOST 1 00:07:39.643 #define SPDK_CONFIG_VIRTIO 1 00:07:39.643 #undef SPDK_CONFIG_VTUNE 00:07:39.643 #define SPDK_CONFIG_VTUNE_DIR 00:07:39.643 #define SPDK_CONFIG_WERROR 1 00:07:39.643 #define SPDK_CONFIG_WPDK_DIR 00:07:39.643 #undef SPDK_CONFIG_XNVME 00:07:39.643 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:39.643 09:31:02 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:39.643 09:31:02 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:39.643 09:31:02 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:39.643 09:31:02 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:39.643 09:31:02 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:39.643 09:31:02 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:39.643 09:31:02 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:39.643 09:31:02 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:39.643 09:31:02 -- paths/export.sh@5 -- # export PATH 00:07:39.643 09:31:02 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:39.643 09:31:02 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:39.643 09:31:02 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:39.643 09:31:02 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:39.643 09:31:02 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:39.643 09:31:02 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:39.643 09:31:02 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:39.643 09:31:02 -- pm/common@16 -- # TEST_TAG=N/A 00:07:39.643 09:31:02 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:39.643 09:31:02 -- common/autotest_common.sh@52 -- # : 1 00:07:39.643 09:31:02 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:07:39.643 09:31:02 -- common/autotest_common.sh@56 -- # : 0 00:07:39.643 09:31:02 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:39.643 09:31:02 -- common/autotest_common.sh@58 -- # : 0 00:07:39.643 09:31:02 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:07:39.643 09:31:02 -- common/autotest_common.sh@60 -- # : 1 00:07:39.643 09:31:02 -- common/autotest_common.sh@61 -- # export 
SPDK_RUN_FUNCTIONAL_TEST 00:07:39.643 09:31:02 -- common/autotest_common.sh@62 -- # : 0 00:07:39.643 09:31:02 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:07:39.643 09:31:02 -- common/autotest_common.sh@64 -- # : 00:07:39.643 09:31:02 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:07:39.643 09:31:02 -- common/autotest_common.sh@66 -- # : 0 00:07:39.643 09:31:02 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:07:39.643 09:31:02 -- common/autotest_common.sh@68 -- # : 0 00:07:39.643 09:31:02 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:07:39.643 09:31:02 -- common/autotest_common.sh@70 -- # : 0 00:07:39.643 09:31:02 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:07:39.643 09:31:02 -- common/autotest_common.sh@72 -- # : 0 00:07:39.643 09:31:02 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:39.643 09:31:02 -- common/autotest_common.sh@74 -- # : 0 00:07:39.643 09:31:02 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:07:39.643 09:31:02 -- common/autotest_common.sh@76 -- # : 0 00:07:39.643 09:31:02 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:07:39.643 09:31:02 -- common/autotest_common.sh@78 -- # : 0 00:07:39.643 09:31:02 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:07:39.643 09:31:02 -- common/autotest_common.sh@80 -- # : 0 00:07:39.643 09:31:02 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:07:39.643 09:31:02 -- common/autotest_common.sh@82 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:07:39.644 09:31:02 -- common/autotest_common.sh@84 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:07:39.644 09:31:02 -- common/autotest_common.sh@86 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:07:39.644 09:31:02 -- common/autotest_common.sh@88 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:07:39.644 09:31:02 -- common/autotest_common.sh@90 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:39.644 09:31:02 -- common/autotest_common.sh@92 -- # : 1 00:07:39.644 09:31:02 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:07:39.644 09:31:02 -- common/autotest_common.sh@94 -- # : 1 00:07:39.644 09:31:02 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:07:39.644 09:31:02 -- common/autotest_common.sh@96 -- # : rdma 00:07:39.644 09:31:02 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:39.644 09:31:02 -- common/autotest_common.sh@98 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:07:39.644 09:31:02 -- common/autotest_common.sh@100 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:07:39.644 09:31:02 -- common/autotest_common.sh@102 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:07:39.644 09:31:02 -- common/autotest_common.sh@104 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:07:39.644 09:31:02 -- common/autotest_common.sh@106 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:07:39.644 09:31:02 -- common/autotest_common.sh@108 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@109 
-- # export SPDK_TEST_VHOST_INIT 00:07:39.644 09:31:02 -- common/autotest_common.sh@110 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:07:39.644 09:31:02 -- common/autotest_common.sh@112 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:39.644 09:31:02 -- common/autotest_common.sh@114 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:07:39.644 09:31:02 -- common/autotest_common.sh@116 -- # : 1 00:07:39.644 09:31:02 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:07:39.644 09:31:02 -- common/autotest_common.sh@118 -- # : 00:07:39.644 09:31:02 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:39.644 09:31:02 -- common/autotest_common.sh@120 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:07:39.644 09:31:02 -- common/autotest_common.sh@122 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:07:39.644 09:31:02 -- common/autotest_common.sh@124 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:07:39.644 09:31:02 -- common/autotest_common.sh@126 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:07:39.644 09:31:02 -- common/autotest_common.sh@128 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:07:39.644 09:31:02 -- common/autotest_common.sh@130 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:07:39.644 09:31:02 -- common/autotest_common.sh@132 -- # : 00:07:39.644 09:31:02 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:07:39.644 09:31:02 -- common/autotest_common.sh@134 -- # : true 00:07:39.644 09:31:02 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:07:39.644 09:31:02 -- common/autotest_common.sh@136 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:07:39.644 09:31:02 -- common/autotest_common.sh@138 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:07:39.644 09:31:02 -- common/autotest_common.sh@140 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:07:39.644 09:31:02 -- common/autotest_common.sh@142 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:07:39.644 09:31:02 -- common/autotest_common.sh@144 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:07:39.644 09:31:02 -- common/autotest_common.sh@146 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:07:39.644 09:31:02 -- common/autotest_common.sh@148 -- # : 00:07:39.644 09:31:02 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:07:39.644 09:31:02 -- common/autotest_common.sh@150 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:07:39.644 09:31:02 -- common/autotest_common.sh@152 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:07:39.644 09:31:02 -- common/autotest_common.sh@154 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:07:39.644 09:31:02 -- common/autotest_common.sh@156 -- # : 0 00:07:39.644 09:31:02 -- 
common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:07:39.644 09:31:02 -- common/autotest_common.sh@158 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:07:39.644 09:31:02 -- common/autotest_common.sh@160 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:07:39.644 09:31:02 -- common/autotest_common.sh@163 -- # : 00:07:39.644 09:31:02 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:07:39.644 09:31:02 -- common/autotest_common.sh@165 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:07:39.644 09:31:02 -- common/autotest_common.sh@167 -- # : 0 00:07:39.644 09:31:02 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:39.644 09:31:02 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:39.644 09:31:02 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:39.644 09:31:02 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:39.644 09:31:02 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:39.644 09:31:02 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:39.644 09:31:02 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:39.644 09:31:02 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:39.644 09:31:02 -- common/autotest_common.sh@174 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:39.644 09:31:02 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:39.644 09:31:02 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:39.644 09:31:02 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:39.644 09:31:02 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:39.644 09:31:02 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:39.644 09:31:02 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:07:39.644 09:31:02 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:39.644 09:31:02 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:39.644 09:31:02 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:39.644 09:31:02 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:39.644 09:31:02 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:39.644 09:31:02 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:07:39.644 09:31:02 -- common/autotest_common.sh@196 -- # cat 00:07:39.644 09:31:02 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:07:39.644 09:31:02 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:39.644 09:31:02 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:39.644 09:31:02 -- 
common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:39.644 09:31:02 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:39.644 09:31:02 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:07:39.644 09:31:02 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:07:39.644 09:31:02 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:39.644 09:31:02 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:39.645 09:31:02 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:39.645 09:31:02 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:39.645 09:31:02 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:39.645 09:31:02 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:39.645 09:31:02 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:39.645 09:31:02 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:39.645 09:31:02 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:39.645 09:31:02 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:39.645 09:31:02 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:39.645 09:31:02 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:39.645 09:31:02 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:07:39.645 09:31:02 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:07:39.645 09:31:02 -- common/autotest_common.sh@249 -- # _LCOV= 00:07:39.645 09:31:02 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:07:39.645 09:31:02 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:07:39.645 09:31:02 -- common/autotest_common.sh@250 -- # _LCOV=1 00:07:39.645 09:31:02 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:39.645 09:31:02 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:07:39.645 09:31:02 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:39.645 09:31:02 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:07:39.645 09:31:02 -- common/autotest_common.sh@259 -- # export valgrind= 00:07:39.645 09:31:02 -- common/autotest_common.sh@259 -- # valgrind= 00:07:39.645 09:31:02 -- common/autotest_common.sh@265 -- # uname -s 00:07:39.645 09:31:02 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:07:39.645 09:31:02 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:07:39.645 09:31:02 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:07:39.645 09:31:02 -- common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:07:39.645 09:31:02 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:39.645 09:31:02 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:07:39.645 09:31:02 
-- common/autotest_common.sh@275 -- # MAKE=make 00:07:39.645 09:31:02 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:07:39.645 09:31:02 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:07:39.645 09:31:02 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:07:39.645 09:31:02 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:39.645 09:31:02 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:07:39.645 09:31:02 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:07:39.645 09:31:02 -- common/autotest_common.sh@319 -- # [[ -z 3190218 ]] 00:07:39.645 09:31:02 -- common/autotest_common.sh@319 -- # kill -0 3190218 00:07:39.645 09:31:02 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:07:39.645 09:31:02 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:07:39.645 09:31:02 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:07:39.645 09:31:02 -- common/autotest_common.sh@332 -- # local mount target_dir 00:07:39.645 09:31:02 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:07:39.645 09:31:02 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:07:39.645 09:31:02 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:07:39.645 09:31:02 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:07:39.645 09:31:02 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.kCXY7n 00:07:39.645 09:31:02 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:39.645 09:31:02 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:07:39.645 09:31:02 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:07:39.645 09:31:02 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.kCXY7n/tests/vfio /tmp/spdk.kCXY7n 00:07:39.645 09:31:02 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:07:39.645 09:31:02 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:39.645 09:31:02 -- common/autotest_common.sh@328 -- # df -T 00:07:39.645 09:31:02 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:07:39.645 09:31:02 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:07:39.645 09:31:02 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:07:39.645 09:31:02 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:07:39.645 09:31:02 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:07:39.645 09:31:02 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:07:39.645 09:31:02 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:39.645 09:31:02 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:07:39.645 09:31:02 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:07:39.645 09:31:02 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:07:39.645 09:31:02 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:07:39.645 09:31:02 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:07:39.645 09:31:02 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:39.645 09:31:02 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:07:39.645 09:31:02 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 
00:07:39.645 09:31:02 -- common/autotest_common.sh@363 -- # avails["$mount"]=53287706624 00:07:39.645 09:31:02 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730607104 00:07:39.645 09:31:02 -- common/autotest_common.sh@364 -- # uses["$mount"]=8442900480 00:07:39.645 09:31:02 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:39.645 09:31:02 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:39.645 09:31:02 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:39.645 09:31:02 -- common/autotest_common.sh@363 -- # avails["$mount"]=30862708736 00:07:39.645 09:31:02 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865301504 00:07:39.645 09:31:02 -- common/autotest_common.sh@364 -- # uses["$mount"]=2592768 00:07:39.645 09:31:02 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:39.645 09:31:02 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:39.645 09:31:02 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:39.645 09:31:02 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340129792 00:07:39.645 09:31:02 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346122240 00:07:39.645 09:31:02 -- common/autotest_common.sh@364 -- # uses["$mount"]=5992448 00:07:39.645 09:31:02 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:39.645 09:31:02 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:39.645 09:31:02 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:39.645 09:31:02 -- common/autotest_common.sh@363 -- # avails["$mount"]=30863421440 00:07:39.645 09:31:02 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865305600 00:07:39.645 09:31:02 -- common/autotest_common.sh@364 -- # uses["$mount"]=1884160 00:07:39.645 09:31:02 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:39.645 09:31:02 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:07:39.645 09:31:02 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:07:39.645 09:31:02 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:07:39.645 09:31:02 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:07:39.645 09:31:02 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:07:39.645 09:31:02 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:07:39.645 09:31:02 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:07:39.645 * Looking for test storage... 
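(Aside on the trace above and the target-space check that follows: this is autotest_common.sh's set_test_storage() walking `df -T | grep -v Filesystem` with `read -r source fs size use avail _ mount`, caching each mount's filesystem type and free space, then testing candidate test directories against the requested budget. A minimal sketch of that pattern, assuming df's default 1K-block units — the candidate directories below are illustrative, not the script's real list; only `requested_size` is taken verbatim from this trace:

```bash
#!/usr/bin/env bash
# Sketch of the storage probe traced above (not the verbatim SPDK helper):
# cache each mount's fs type and free bytes, then accept the first
# candidate directory whose filesystem can hold the requested size.
declare -A fss avails

# -P keeps every filesystem on one line so read's field split stays aligned
while read -r _src fs _size _used avail _pct mnt; do
  fss["$mnt"]=$fs
  avails["$mnt"]=$(( avail * 1024 ))   # assume df's default 1K blocks
done < <(df -TP | grep -v Filesystem)

requested_size=2214592512   # 2 GiB + 64 MiB slack, as in the trace above
for target_dir in /tmp/spdk_test /var/tmp; do   # illustrative candidates
  mnt=$(df -P "$target_dir" | awk '$1 !~ /Filesystem/ {print $6}')
  if (( ${avails[$mnt]:-0} >= requested_size )); then
    printf '* Found test storage at %s\n' "$target_dir"
    break
  fi
done
```

In this job the overlay root reports roughly 53.3 GB available out of 61.7 GB, comfortably above the ~2.2 GB requested, so the workspace itself is chosen and SPDK_TEST_STORAGE lands under spdk/test/fuzz/llvm/vfio, as the "Found test storage" line below confirms.)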
00:07:39.645 09:31:02 -- common/autotest_common.sh@369 -- # local target_space new_size 00:07:39.645 09:31:02 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:07:39.645 09:31:02 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:39.645 09:31:02 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:39.645 09:31:02 -- common/autotest_common.sh@373 -- # mount=/ 00:07:39.645 09:31:02 -- common/autotest_common.sh@375 -- # target_space=53287706624 00:07:39.645 09:31:02 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:07:39.645 09:31:02 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:07:39.645 09:31:02 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:07:39.645 09:31:02 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:07:39.645 09:31:02 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:07:39.645 09:31:02 -- common/autotest_common.sh@382 -- # new_size=10657492992 00:07:39.645 09:31:02 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:39.645 09:31:02 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:39.645 09:31:02 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:39.645 09:31:02 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:39.645 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:07:39.645 09:31:02 -- common/autotest_common.sh@390 -- # return 0 00:07:39.645 09:31:02 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:07:39.645 09:31:02 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:07:39.645 09:31:02 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:39.645 09:31:02 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:39.645 09:31:02 -- common/autotest_common.sh@1682 -- # true 00:07:39.645 09:31:02 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:07:39.645 09:31:02 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:39.645 09:31:02 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:39.645 09:31:02 -- common/autotest_common.sh@27 -- # exec 00:07:39.645 09:31:02 -- common/autotest_common.sh@29 -- # exec 00:07:39.645 09:31:02 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:39.645 09:31:02 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:07:39.645 09:31:02 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:39.645 09:31:02 -- common/autotest_common.sh@18 -- # set -x 00:07:39.645 09:31:02 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:39.645 09:31:02 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:39.646 09:31:02 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:39.906 09:31:02 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:39.906 09:31:02 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:39.906 09:31:02 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:39.906 09:31:02 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:39.906 09:31:02 -- scripts/common.sh@335 -- # IFS=.-: 00:07:39.906 09:31:02 -- scripts/common.sh@335 -- # read -ra ver1 00:07:39.906 09:31:02 -- scripts/common.sh@336 -- # IFS=.-: 00:07:39.906 09:31:02 -- scripts/common.sh@336 -- # read -ra ver2 00:07:39.906 09:31:02 -- scripts/common.sh@337 -- # local 'op=<' 00:07:39.906 09:31:02 -- scripts/common.sh@339 -- # ver1_l=2 00:07:39.906 09:31:02 -- scripts/common.sh@340 -- # ver2_l=1 00:07:39.906 09:31:02 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:39.906 09:31:02 -- scripts/common.sh@343 -- # case "$op" in 00:07:39.906 09:31:02 -- scripts/common.sh@344 -- # : 1 00:07:39.906 09:31:02 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:39.906 09:31:02 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:39.906 09:31:02 -- scripts/common.sh@364 -- # decimal 1 00:07:39.906 09:31:02 -- scripts/common.sh@352 -- # local d=1 00:07:39.906 09:31:02 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:39.906 09:31:02 -- scripts/common.sh@354 -- # echo 1 00:07:39.906 09:31:02 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:39.906 09:31:02 -- scripts/common.sh@365 -- # decimal 2 00:07:39.906 09:31:02 -- scripts/common.sh@352 -- # local d=2 00:07:39.906 09:31:02 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:39.906 09:31:02 -- scripts/common.sh@354 -- # echo 2 00:07:39.906 09:31:02 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:39.906 09:31:02 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:39.906 09:31:02 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:39.906 09:31:02 -- scripts/common.sh@367 -- # return 0 00:07:39.906 09:31:02 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:39.906 09:31:02 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:39.906 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.906 --rc genhtml_branch_coverage=1 00:07:39.906 --rc genhtml_function_coverage=1 00:07:39.906 --rc genhtml_legend=1 00:07:39.906 --rc geninfo_all_blocks=1 00:07:39.906 --rc geninfo_unexecuted_blocks=1 00:07:39.906 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.906 ' 00:07:39.906 09:31:02 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:39.906 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.906 --rc genhtml_branch_coverage=1 00:07:39.906 --rc genhtml_function_coverage=1 00:07:39.906 --rc genhtml_legend=1 00:07:39.906 --rc geninfo_all_blocks=1 00:07:39.906 --rc geninfo_unexecuted_blocks=1 00:07:39.906 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.906 ' 00:07:39.906 09:31:02 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:39.906 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:07:39.906 --rc genhtml_branch_coverage=1 00:07:39.906 --rc genhtml_function_coverage=1 00:07:39.906 --rc genhtml_legend=1 00:07:39.906 --rc geninfo_all_blocks=1 00:07:39.906 --rc geninfo_unexecuted_blocks=1 00:07:39.906 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.906 ' 00:07:39.906 09:31:02 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:39.906 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.906 --rc genhtml_branch_coverage=1 00:07:39.906 --rc genhtml_function_coverage=1 00:07:39.906 --rc genhtml_legend=1 00:07:39.906 --rc geninfo_all_blocks=1 00:07:39.906 --rc geninfo_unexecuted_blocks=1 00:07:39.906 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.906 ' 00:07:39.906 09:31:02 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:07:39.906 09:31:02 -- ../common.sh@8 -- # pids=() 00:07:39.906 09:31:02 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:07:39.906 09:31:02 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:07:39.906 09:31:02 -- vfio/run.sh@59 -- # fuzz_num=7 00:07:39.906 09:31:02 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:07:39.906 09:31:02 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:07:39.906 09:31:02 -- vfio/run.sh@65 -- # mem_size=0 00:07:39.906 09:31:02 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:07:39.906 09:31:02 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:07:39.906 09:31:02 -- ../common.sh@69 -- # local fuzz_num=7 00:07:39.906 09:31:02 -- ../common.sh@70 -- # local time=1 00:07:39.906 09:31:02 -- ../common.sh@72 -- # (( i = 0 )) 00:07:39.906 09:31:02 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:39.906 09:31:02 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:39.906 09:31:02 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:07:39.906 09:31:02 -- vfio/run.sh@23 -- # local timen=1 00:07:39.906 09:31:02 -- vfio/run.sh@24 -- # local core=0x1 00:07:39.906 09:31:02 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:07:39.906 09:31:02 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:07:39.906 09:31:02 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:07:39.906 09:31:02 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:07:39.906 09:31:02 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:07:39.906 09:31:02 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:07:39.906 09:31:02 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:07:39.906 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:39.906 09:31:02 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 
-Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0
00:07:39.906 [2024-11-29 09:31:02.579394] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:39.906 [2024-11-29 09:31:02.579467] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3190449 ]
00:07:39.906 EAL: No free 2048 kB hugepages reported on node 1
00:07:39.906 [2024-11-29 09:31:02.651407] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:39.906 [2024-11-29 09:31:02.725544] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:39.906 [2024-11-29 09:31:02.725709] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:40.166 INFO: Running with entropic power schedule (0xFF, 100).
00:07:40.166 INFO: Seed: 1234030805
00:07:40.166 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f),
00:07:40.166 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0),
00:07:40.166 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0
00:07:40.166 INFO: A corpus is not provided, starting from an empty corpus
00:07:40.166 #2 INITED exec/s: 0 rss: 60Mb
00:07:40.166 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:40.166 This may also happen if the target rejected all inputs we tried so far
00:07:40.684 NEW_FUNC[1/631]: 0x43a218 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85
00:07:40.684 NEW_FUNC[2/631]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:07:40.684 #7 NEW cov: 10765 ft: 10731 corp: 2/60b lim: 60 exec/s: 0 rss: 67Mb L: 59/59 MS: 5 InsertRepeatedBytes-InsertByte-ShuffleBytes-ChangeByte-InsertRepeatedBytes-
00:07:40.943 #9 NEW cov: 10779 ft: 13244 corp: 3/71b lim: 60 exec/s: 0 rss: 68Mb L: 11/59 MS: 2 ShuffleBytes-CrossOver-
00:07:41.202 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:07:41.202 #10 NEW cov: 10796 ft: 13983 corp: 4/130b lim: 60 exec/s: 0 rss: 69Mb L: 59/59 MS: 1 ChangeASCIIInt-
00:07:41.202 #11 NEW cov: 10796 ft: 15385 corp: 5/189b lim: 60 exec/s: 11 rss: 69Mb L: 59/59 MS: 1 ChangeBit-
00:07:41.460 #12 NEW cov: 10796 ft: 15994 corp: 6/248b lim: 60 exec/s: 12 rss: 70Mb L: 59/59 MS: 1 ShuffleBytes-
00:07:41.719 #13 NEW cov: 10796 ft: 16218 corp: 7/307b lim: 60 exec/s: 13 rss: 70Mb L: 59/59 MS: 1 CopyPart-
00:07:41.978 #14 NEW cov: 10796 ft: 16550 corp: 8/366b lim: 60 exec/s: 14 rss: 70Mb L: 59/59 MS: 1 ChangeByte-
00:07:41.978 #15 NEW cov: 10803 ft: 16729 corp: 9/425b lim: 60 exec/s: 15 rss: 70Mb L: 59/59 MS: 1 ChangeBit-
00:07:42.236 #19 NEW cov: 10803 ft: 16890 corp: 10/436b lim: 60 exec/s: 9 rss: 70Mb L: 11/59 MS: 4 ShuffleBytes-ChangeByte-InsertByte-CrossOver-
00:07:42.236 #19 DONE cov: 10803 ft: 16890 corp: 10/436b lim: 60 exec/s: 9 rss: 70Mb
00:07:42.236 Done 19 runs in 2 second(s)
00:07:42.495 09:31:05 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0
00:07:42.495 09:31:05 -- ../common.sh@72 -- # (( i++ ))
00:07:42.495 09:31:05 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:42.495 09:31:05 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1
00:07:42.495 09:31:05 -- vfio/run.sh@22 -- # local fuzzer_type=1
00:07:42.495 09:31:05 -- vfio/run.sh@23 -- # local timen=1
00:07:42.495 09:31:05 -- vfio/run.sh@24 -- # local core=0x1
00:07:42.495 09:31:05 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1
00:07:42.495 09:31:05 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1
00:07:42.495 09:31:05 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1
00:07:42.495 09:31:05 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2
00:07:42.495 09:31:05 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf
00:07:42.495 09:31:05 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1
00:07:42.495 09:31:05 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%;
00:07:42.495 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:07:42.495 09:31:05 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1
00:07:42.755 [2024-11-29 09:31:05.328791] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... [2024-11-29 09:31:05.328885] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3190838 ]
00:07:42.755 EAL: No free 2048 kB hugepages reported on node 1
00:07:42.755 [2024-11-29 09:31:05.401063] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:42.755 [2024-11-29 09:31:05.471560] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:42.755 [2024-11-29 09:31:05.471721] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:43.020 INFO: Running with entropic power schedule (0xFF, 100).
00:07:43.020 INFO: Seed: 3979015828
00:07:43.020 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f),
00:07:43.020 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0),
00:07:43.020 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1
00:07:43.020 INFO: A corpus is not provided, starting from an empty corpus
00:07:43.020 #2 INITED exec/s: 0 rss: 61Mb
00:07:43.020 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:43.020 This may also happen if the target rejected all inputs we tried so far
00:07:43.020 [2024-11-29 09:31:05.775637] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:07:43.020 [2024-11-29 09:31:05.775681] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:07:43.020 [2024-11-29 09:31:05.775699] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:07:43.343 NEW_FUNC[1/638]: 0x43a7b8 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72
00:07:43.343 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:07:43.343 #3 NEW cov: 10779 ft: 10467 corp: 2/30b lim: 40 exec/s: 0 rss: 67Mb L: 29/29 MS: 1 InsertRepeatedBytes-
00:07:43.628 [2024-11-29 09:31:06.247045] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:07:43.628 [2024-11-29 09:31:06.247079] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:07:43.628 [2024-11-29 09:31:06.247098] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:07:43.628 #4 NEW cov: 10793 ft: 13195 corp: 3/53b lim: 40 exec/s: 0 rss: 68Mb L: 23/29 MS: 1 EraseBytes-
00:07:43.628 [2024-11-29 09:31:06.430976] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:07:43.628 [2024-11-29 09:31:06.430998] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:07:43.628 [2024-11-29 09:31:06.431017] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:07:43.894 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:07:43.894 #5 NEW cov: 10810 ft: 14821 corp: 4/76b lim: 40 exec/s: 0 rss: 69Mb L: 23/29 MS: 1 ChangeASCIIInt-
00:07:43.894 [2024-11-29 09:31:06.619451] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:07:43.894 [2024-11-29 09:31:06.619474] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:07:43.894 [2024-11-29 09:31:06.619491] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:07:43.894 #6 NEW cov: 10813 ft: 15508 corp: 5/105b lim: 40 exec/s: 6 rss: 69Mb L: 29/29 MS: 1 CrossOver-
00:07:44.153 [2024-11-29 09:31:06.804075] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:07:44.153 [2024-11-29 09:31:06.804098] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:07:44.153 [2024-11-29 09:31:06.804115] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:07:44.153 #7 NEW cov: 10813 ft: 15631 corp: 6/128b lim: 40 exec/s: 7 rss: 69Mb L: 23/29 MS: 1 ChangeASCIIInt-
00:07:44.153 [2024-11-29 09:31:06.985884] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:07:44.153 [2024-11-29 09:31:06.985907] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:07:44.153 [2024-11-29 09:31:06.985926] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:07:44.413 #8 NEW cov: 10813 ft: 15846 corp: 7/157b lim: 40 exec/s: 8 rss: 69Mb L: 29/29 MS: 1 ChangeBit-
00:07:44.413 [2024-11-29 09:31:07.168665] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:07:44.413 [2024-11-29 09:31:07.168688] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:07:44.413 [2024-11-29 09:31:07.168704] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:07:44.672 #9 NEW cov: 10813 ft: 15905 corp: 8/186b lim: 40 exec/s: 9 rss: 69Mb L: 29/29 MS: 1 ChangeASCIIInt-
00:07:44.672 [2024-11-29 09:31:07.354035] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:07:44.672 [2024-11-29 09:31:07.354057] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:07:44.672 [2024-11-29 09:31:07.354073] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:07:44.672 #10 NEW cov: 10813 ft: 16353 corp: 9/193b lim: 40 exec/s: 10 rss: 69Mb L: 7/29 MS: 1 InsertRepeatedBytes-
00:07:44.961 [2024-11-29 09:31:07.548539] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:07:44.961 [2024-11-29 09:31:07.548561] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:07:44.961 [2024-11-29 09:31:07.548577] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:07:44.961 #11 NEW cov: 10820 ft: 16714 corp: 10/216b lim: 40 exec/s: 11 rss: 69Mb L: 23/29 MS: 1 ChangeByte-
00:07:44.961 [2024-11-29 09:31:07.731214] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1
00:07:44.961 [2024-11-29 09:31:07.731235] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument
00:07:44.961 [2024-11-29 09:31:07.731253] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure
00:07:45.221 #12 NEW cov: 10820 ft: 16866 corp: 11/248b lim: 40 exec/s: 6 rss: 69Mb L: 32/32 MS: 1 InsertRepeatedBytes-
00:07:45.221 #12 DONE cov: 10820 ft: 16866 corp: 11/248b lim: 40 exec/s: 6 rss: 69Mb
00:07:45.221 Done 12 runs in 2 second(s)
00:07:45.482 09:31:08 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1
00:07:45.482 09:31:08 -- ../common.sh@72 -- # (( i++ ))
00:07:45.482 09:31:08 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:45.482 09:31:08 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1
00:07:45.482 09:31:08 -- vfio/run.sh@22 -- # local fuzzer_type=2
00:07:45.482 09:31:08 -- vfio/run.sh@23 -- # local timen=1
00:07:45.482 09:31:08 -- vfio/run.sh@24 -- # local core=0x1
00:07:45.482 09:31:08 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2
00:07:45.482 09:31:08 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2
00:07:45.482 09:31:08 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1
00:07:45.482 09:31:08 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2
00:07:45.482 09:31:08 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf
00:07:45.482 09:31:08 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2
00:07:45.482 09:31:08 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%;
00:07:45.482 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:07:45.482 09:31:08 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2
00:07:45.482 [2024-11-29 09:31:08.143147] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:45.482 [2024-11-29 09:31:08.143215] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3191375 ]
00:07:45.482 EAL: No free 2048 kB hugepages reported on node 1
00:07:45.482 [2024-11-29 09:31:08.213905] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:45.482 [2024-11-29 09:31:08.283299] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:45.482 [2024-11-29 09:31:08.283458] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:45.741 INFO: Running with entropic power schedule (0xFF, 100).
00:07:45.741 INFO: Seed: 2496070307
00:07:45.741 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f),
00:07:45.741 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0),
00:07:45.741 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2
00:07:45.741 INFO: A corpus is not provided, starting from an empty corpus
00:07:45.741 #2 INITED exec/s: 0 rss: 62Mb
00:07:45.741 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:45.741 This may also happen if the target rejected all inputs we tried so far 00:07:46.001 [2024-11-29 09:31:08.613279] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:46.261 NEW_FUNC[1/631]: 0x43b1a8 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:07:46.261 NEW_FUNC[2/631]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:46.261 #8 NEW cov: 10716 ft: 10716 corp: 2/34b lim: 80 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:07:46.521 [2024-11-29 09:31:09.104627] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:46.521 NEW_FUNC[1/1]: 0x10aaf28 in spdk_nvmf_request_exec /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:4585 00:07:46.521 #14 NEW cov: 10761 ft: 14183 corp: 3/59b lim: 80 exec/s: 0 rss: 69Mb L: 25/33 MS: 1 InsertRepeatedBytes- 00:07:46.521 [2024-11-29 09:31:09.320143] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:46.780 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:46.780 #15 NEW cov: 10778 ft: 15209 corp: 4/92b lim: 80 exec/s: 0 rss: 70Mb L: 33/33 MS: 1 ChangeByte- 00:07:46.780 [2024-11-29 09:31:09.509677] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:46.780 #16 NEW cov: 10778 ft: 16131 corp: 5/119b lim: 80 exec/s: 16 rss: 70Mb L: 27/33 MS: 1 EraseBytes- 00:07:47.039 [2024-11-29 09:31:09.700334] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:47.039 #17 NEW cov: 10778 ft: 16395 corp: 6/144b lim: 80 exec/s: 17 rss: 70Mb L: 25/33 MS: 1 ChangeBinInt- 00:07:47.298 [2024-11-29 09:31:09.886501] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:47.298 #23 NEW cov: 10778 ft: 16756 corp: 7/169b lim: 80 exec/s: 23 rss: 70Mb L: 25/33 MS: 1 ChangeBinInt- 00:07:47.298 [2024-11-29 09:31:10.073498] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: cmd 5 failed: Invalid argument 00:07:47.298 [2024-11-29 09:31:10.073544] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:07:47.558 NEW_FUNC[1/2]: 0x13176f8 in endpoint_id /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:638 00:07:47.558 NEW_FUNC[2/2]: 0x1317998 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3084 00:07:47.558 #24 NEW cov: 10791 ft: 17337 corp: 8/190b lim: 80 exec/s: 24 rss: 70Mb L: 21/33 MS: 1 EraseBytes- 00:07:47.558 [2024-11-29 09:31:10.272760] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:07:47.558 #25 NEW cov: 10798 ft: 17779 corp: 9/217b lim: 80 exec/s: 25 rss: 70Mb L: 27/33 MS: 1 ShuffleBytes- 00:07:47.817 [2024-11-29 09:31:10.459717] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-2/domain/1: msg0: cmd 5 failed: Invalid argument 00:07:47.817 [2024-11-29 09:31:10.459750] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 5 return failure 00:07:47.817 #26 NEW cov: 10798 ft: 18211 corp: 10/236b lim: 80 exec/s: 13 rss: 70Mb L: 19/33 MS: 1 EraseBytes- 00:07:47.817 #26 DONE cov: 10798 ft: 18211 corp: 10/236b lim: 80 exec/s: 13 rss: 70Mb 00:07:47.817 Done 26 runs in 2 second(s) 00:07:48.077 09:31:10 
-- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:07:48.077 09:31:10 -- ../common.sh@72 -- # (( i++ )) 00:07:48.077 09:31:10 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:48.077 09:31:10 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:48.077 09:31:10 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:07:48.077 09:31:10 -- vfio/run.sh@23 -- # local timen=1 00:07:48.077 09:31:10 -- vfio/run.sh@24 -- # local core=0x1 00:07:48.077 09:31:10 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:07:48.077 09:31:10 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:07:48.077 09:31:10 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:07:48.077 09:31:10 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:07:48.077 09:31:10 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:07:48.077 09:31:10 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:07:48.077 09:31:10 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:07:48.077 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:48.077 09:31:10 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:07:48.077 [2024-11-29 09:31:10.881927] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:48.077 [2024-11-29 09:31:10.882022] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3191919 ] 00:07:48.077 EAL: No free 2048 kB hugepages reported on node 1 00:07:48.337 [2024-11-29 09:31:10.954967] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:48.337 [2024-11-29 09:31:11.023256] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:48.337 [2024-11-29 09:31:11.023412] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.596 INFO: Running with entropic power schedule (0xFF, 100). 00:07:48.596 INFO: Seed: 933090730 00:07:48.596 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:07:48.596 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:07:48.596 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:07:48.596 INFO: A corpus is not provided, starting from an empty corpus 00:07:48.596 #2 INITED exec/s: 0 rss: 62Mb 00:07:48.596 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:48.596 This may also happen if the target rejected all inputs we tried so far 00:07:48.856 NEW_FUNC[1/632]: 0x43b898 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:07:48.856 NEW_FUNC[2/632]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:48.856 #11 NEW cov: 10752 ft: 10700 corp: 2/84b lim: 320 exec/s: 0 rss: 67Mb L: 83/83 MS: 4 ShuffleBytes-CrossOver-CopyPart-InsertRepeatedBytes- 00:07:49.115 #27 NEW cov: 10766 ft: 13561 corp: 3/290b lim: 320 exec/s: 0 rss: 69Mb L: 206/206 MS: 1 InsertRepeatedBytes- 00:07:49.115 #34 NEW cov: 10766 ft: 14483 corp: 4/420b lim: 320 exec/s: 0 rss: 70Mb L: 130/206 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:49.374 #35 NEW cov: 10766 ft: 14862 corp: 5/558b lim: 320 exec/s: 0 rss: 70Mb L: 138/206 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\014"- 00:07:49.374 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:49.374 #41 NEW cov: 10783 ft: 15301 corp: 6/688b lim: 320 exec/s: 0 rss: 70Mb L: 130/206 MS: 1 ShuffleBytes- 00:07:49.633 #42 NEW cov: 10783 ft: 15679 corp: 7/818b lim: 320 exec/s: 42 rss: 70Mb L: 130/206 MS: 1 ShuffleBytes- 00:07:49.633 #43 NEW cov: 10783 ft: 15795 corp: 8/901b lim: 320 exec/s: 43 rss: 70Mb L: 83/206 MS: 1 ShuffleBytes- 00:07:49.633 #44 NEW cov: 10783 ft: 16003 corp: 9/984b lim: 320 exec/s: 44 rss: 70Mb L: 83/206 MS: 1 ShuffleBytes- 00:07:49.896 #45 NEW cov: 10783 ft: 16216 corp: 10/1244b lim: 320 exec/s: 45 rss: 70Mb L: 260/260 MS: 1 CrossOver- 00:07:49.896 #46 NEW cov: 10783 ft: 16505 corp: 11/1327b lim: 320 exec/s: 46 rss: 70Mb L: 83/260 MS: 1 ChangeByte- 00:07:50.154 #47 NEW cov: 10783 ft: 17571 corp: 12/1410b lim: 320 exec/s: 47 rss: 70Mb L: 83/260 MS: 1 ShuffleBytes- 00:07:50.412 #48 NEW cov: 10783 ft: 17945 corp: 13/1493b lim: 320 exec/s: 48 rss: 70Mb L: 83/260 MS: 1 ChangeBinInt- 00:07:50.412 #49 NEW cov: 10790 ft: 18065 corp: 14/1623b lim: 320 exec/s: 49 rss: 70Mb L: 130/260 MS: 1 CMP- DE: "\000\000\000\026"- 00:07:50.671 #50 NEW cov: 10790 ft: 18155 corp: 15/1818b lim: 320 exec/s: 25 rss: 70Mb L: 195/260 MS: 1 InsertRepeatedBytes- 00:07:50.671 #50 DONE cov: 10790 ft: 18155 corp: 15/1818b lim: 320 exec/s: 25 rss: 70Mb 00:07:50.671 ###### Recommended dictionary. ###### 00:07:50.671 "\377\377\377\377\377\377\377\014" # Uses: 0 00:07:50.671 "\000\000\000\026" # Uses: 0 00:07:50.671 ###### End of recommended dictionary. 
###### 00:07:50.671 Done 50 runs in 2 second(s) 00:07:50.931 09:31:13 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:07:50.931 09:31:13 -- ../common.sh@72 -- # (( i++ )) 00:07:50.931 09:31:13 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:50.931 09:31:13 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:50.931 09:31:13 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:07:50.931 09:31:13 -- vfio/run.sh@23 -- # local timen=1 00:07:50.931 09:31:13 -- vfio/run.sh@24 -- # local core=0x1 00:07:50.931 09:31:13 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:07:50.931 09:31:13 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:07:50.931 09:31:13 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:07:50.931 09:31:13 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:07:50.931 09:31:13 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:07:50.931 09:31:13 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:07:50.931 09:31:13 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:07:50.931 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:50.931 09:31:13 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:07:50.931 [2024-11-29 09:31:13.722813] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:50.931 [2024-11-29 09:31:13.722901] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3192462 ] 00:07:50.931 EAL: No free 2048 kB hugepages reported on node 1 00:07:51.190 [2024-11-29 09:31:13.795948] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.190 [2024-11-29 09:31:13.865896] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:51.190 [2024-11-29 09:31:13.866037] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.449 INFO: Running with entropic power schedule (0xFF, 100). 00:07:51.449 INFO: Seed: 3777779682 00:07:51.449 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:07:51.449 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:07:51.449 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:07:51.449 INFO: A corpus is not provided, starting from an empty corpus 00:07:51.449 #2 INITED exec/s: 0 rss: 61Mb 00:07:51.449 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:51.449 This may also happen if the target rejected all inputs we tried so far 00:07:51.967 NEW_FUNC[1/620]: 0x43c118 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:07:51.967 NEW_FUNC[2/620]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:51.967 #11 NEW cov: 10605 ft: 10715 corp: 2/101b lim: 320 exec/s: 0 rss: 67Mb L: 100/100 MS: 4 InsertRepeatedBytes-EraseBytes-ChangeBinInt-InsertRepeatedBytes- 00:07:51.967 NEW_FUNC[1/12]: 0x13537c8 in handle_cmd_rsp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:2498 00:07:51.967 NEW_FUNC[2/12]: 0x15d3c58 in _is_io_flags_valid /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_ns_cmd.c:141 00:07:51.967 #12 NEW cov: 10761 ft: 13349 corp: 3/201b lim: 320 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 ChangeBinInt- 00:07:52.225 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:52.226 #13 NEW cov: 10778 ft: 13785 corp: 4/301b lim: 320 exec/s: 0 rss: 70Mb L: 100/100 MS: 1 ChangeBit- 00:07:52.484 #14 NEW cov: 10778 ft: 14818 corp: 5/402b lim: 320 exec/s: 14 rss: 70Mb L: 101/101 MS: 1 InsertByte- 00:07:52.484 #15 NEW cov: 10781 ft: 15550 corp: 6/458b lim: 320 exec/s: 15 rss: 70Mb L: 56/101 MS: 1 EraseBytes- 00:07:52.742 #16 NEW cov: 10781 ft: 15795 corp: 7/558b lim: 320 exec/s: 16 rss: 70Mb L: 100/101 MS: 1 ChangeByte- 00:07:53.000 #17 NEW cov: 10781 ft: 16116 corp: 8/750b lim: 320 exec/s: 17 rss: 70Mb L: 192/192 MS: 1 InsertRepeatedBytes- 00:07:53.260 #18 NEW cov: 10788 ft: 16270 corp: 9/806b lim: 320 exec/s: 18 rss: 70Mb L: 56/192 MS: 1 CrossOver- 00:07:53.260 [2024-11-29 09:31:15.952430] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to memory map DMA region [(nil), (nil)) fd=325 offset=0 prot=0x3: Invalid argument 00:07:53.260 [2024-11-29 09:31:15.952470] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: failed to add DMA region [0, 0) offset=0 flags=0x3: Invalid argument 00:07:53.260 [2024-11-29 09:31:15.952481] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 2 failed: Invalid argument 00:07:53.260 [2024-11-29 09:31:15.952497] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:07:53.260 [2024-11-29 09:31:15.953436] vfio_user.c:3094:vfio_user_log: *WARNING*: /tmp/vfio-user-4/domain/1: failed to remove DMA region [0, 0) flags=0: No such file or directory 00:07:53.260 [2024-11-29 09:31:15.953455] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-4/domain/1: msg0: cmd 3 failed: No such file or directory 00:07:53.260 [2024-11-29 09:31:15.953470] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 3 return failure 00:07:53.260 NEW_FUNC[1/6]: 0x13176f8 in endpoint_id /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:638 00:07:53.260 NEW_FUNC[2/6]: 0x1317998 in vfio_user_log /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/vfio_user.c:3084 00:07:53.260 #23 NEW cov: 10822 ft: 16443 corp: 10/872b lim: 320 exec/s: 23 rss: 70Mb L: 66/192 MS: 5 ChangeByte-ShuffleBytes-ChangeByte-ChangeByte-InsertRepeatedBytes- 00:07:53.519 #24 NEW cov: 10822 ft: 16737 corp: 11/928b lim: 320 exec/s: 12 rss: 70Mb L: 56/192 MS: 1 ChangeBit- 00:07:53.519 #24 DONE cov: 10822 ft: 16737 corp: 11/928b lim: 320 exec/s: 12 rss: 70Mb 00:07:53.519 Done 24 runs in 2 second(s) 00:07:53.779 
09:31:16 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:07:53.779 09:31:16 -- ../common.sh@72 -- # (( i++ )) 00:07:53.779 09:31:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:53.779 09:31:16 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:53.779 09:31:16 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:07:53.779 09:31:16 -- vfio/run.sh@23 -- # local timen=1 00:07:53.779 09:31:16 -- vfio/run.sh@24 -- # local core=0x1 00:07:53.779 09:31:16 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:07:53.779 09:31:16 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:07:53.779 09:31:16 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:07:53.779 09:31:16 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:07:53.779 09:31:16 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:07:53.779 09:31:16 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:07:53.779 09:31:16 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:07:53.779 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:53.779 09:31:16 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:07:53.779 [2024-11-29 09:31:16.565220] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:53.779 [2024-11-29 09:31:16.565301] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3193011 ] 00:07:53.779 EAL: No free 2048 kB hugepages reported on node 1 00:07:54.038 [2024-11-29 09:31:16.636347] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.038 [2024-11-29 09:31:16.705253] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:54.038 [2024-11-29 09:31:16.705392] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.297 INFO: Running with entropic power schedule (0xFF, 100). 00:07:54.297 INFO: Seed: 2321126045 00:07:54.297 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:07:54.297 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:07:54.297 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:07:54.297 INFO: A corpus is not provided, starting from an empty corpus 00:07:54.297 #2 INITED exec/s: 0 rss: 62Mb 00:07:54.297 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:54.297 This may also happen if the target rejected all inputs we tried so far 00:07:54.297 [2024-11-29 09:31:16.963681] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:54.297 [2024-11-29 09:31:16.963728] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:54.556 NEW_FUNC[1/638]: 0x43cb18 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:07:54.556 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:54.556 #4 NEW cov: 10781 ft: 10557 corp: 2/38b lim: 120 exec/s: 0 rss: 67Mb L: 37/37 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:54.556 [2024-11-29 09:31:17.365468] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:54.556 [2024-11-29 09:31:17.365518] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:54.815 #10 NEW cov: 10795 ft: 13083 corp: 3/75b lim: 120 exec/s: 0 rss: 69Mb L: 37/37 MS: 1 ChangeBinInt- 00:07:54.815 [2024-11-29 09:31:17.480492] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:54.815 [2024-11-29 09:31:17.480528] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:54.815 #16 NEW cov: 10795 ft: 13374 corp: 4/112b lim: 120 exec/s: 0 rss: 70Mb L: 37/37 MS: 1 ChangeBinInt- 00:07:54.815 [2024-11-29 09:31:17.595316] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:54.815 [2024-11-29 09:31:17.595357] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:55.074 #17 NEW cov: 10798 ft: 13652 corp: 5/150b lim: 120 exec/s: 0 rss: 70Mb L: 38/38 MS: 1 InsertByte- 00:07:55.074 [2024-11-29 09:31:17.709097] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:55.074 [2024-11-29 09:31:17.709132] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:55.074 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:55.074 #18 NEW cov: 10815 ft: 14858 corp: 6/187b lim: 120 exec/s: 0 rss: 70Mb L: 37/38 MS: 1 ChangeBinInt- 00:07:55.074 [2024-11-29 09:31:17.823987] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:55.074 [2024-11-29 09:31:17.824022] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:55.074 #19 NEW cov: 10815 ft: 15485 corp: 7/225b lim: 120 exec/s: 0 rss: 70Mb L: 38/38 MS: 1 InsertByte- 00:07:55.333 [2024-11-29 09:31:17.937774] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:55.333 [2024-11-29 09:31:17.937807] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:55.333 #25 NEW cov: 10815 ft: 15789 corp: 8/278b lim: 120 exec/s: 25 rss: 70Mb L: 53/53 MS: 1 CopyPart- 00:07:55.333 [2024-11-29 09:31:18.051528] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:55.333 [2024-11-29 09:31:18.051560] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:55.333 #26 NEW cov: 10815 ft: 15869 corp: 9/345b lim: 120 exec/s: 26 rss: 70Mb L: 67/67 MS: 1 InsertRepeatedBytes- 00:07:55.333 [2024-11-29 09:31:18.165503] 
vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:55.333 [2024-11-29 09:31:18.165536] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:55.593 #27 NEW cov: 10815 ft: 15930 corp: 10/382b lim: 120 exec/s: 27 rss: 70Mb L: 37/67 MS: 1 ChangeBit- 00:07:55.593 [2024-11-29 09:31:18.269198] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:55.593 [2024-11-29 09:31:18.269232] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:55.593 #28 NEW cov: 10815 ft: 16393 corp: 11/443b lim: 120 exec/s: 28 rss: 70Mb L: 61/67 MS: 1 InsertRepeatedBytes- 00:07:55.593 [2024-11-29 09:31:18.392140] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:55.593 [2024-11-29 09:31:18.392172] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:55.852 #29 NEW cov: 10815 ft: 16479 corp: 12/549b lim: 120 exec/s: 29 rss: 70Mb L: 106/106 MS: 1 InsertRepeatedBytes- 00:07:55.852 [2024-11-29 09:31:18.507022] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:55.852 [2024-11-29 09:31:18.507055] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:55.852 #30 NEW cov: 10815 ft: 16569 corp: 13/591b lim: 120 exec/s: 30 rss: 70Mb L: 42/106 MS: 1 CopyPart- 00:07:55.852 [2024-11-29 09:31:18.620860] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:55.852 [2024-11-29 09:31:18.620894] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:55.852 #31 NEW cov: 10815 ft: 16614 corp: 14/628b lim: 120 exec/s: 31 rss: 70Mb L: 37/106 MS: 1 ChangeBit- 00:07:56.111 [2024-11-29 09:31:18.735792] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:56.111 [2024-11-29 09:31:18.735832] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:56.111 #32 NEW cov: 10822 ft: 16657 corp: 15/689b lim: 120 exec/s: 32 rss: 70Mb L: 61/106 MS: 1 ChangeBit- 00:07:56.111 [2024-11-29 09:31:18.849766] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:56.111 [2024-11-29 09:31:18.849799] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:56.111 #33 NEW cov: 10822 ft: 16693 corp: 16/727b lim: 120 exec/s: 16 rss: 70Mb L: 38/106 MS: 1 ShuffleBytes- 00:07:56.111 #33 DONE cov: 10822 ft: 16693 corp: 16/727b lim: 120 exec/s: 16 rss: 70Mb 00:07:56.111 Done 33 runs in 2 second(s) 00:07:56.371 09:31:19 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5 00:07:56.371 09:31:19 -- ../common.sh@72 -- # (( i++ )) 00:07:56.371 09:31:19 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:56.371 09:31:19 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:56.371 09:31:19 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:07:56.371 09:31:19 -- vfio/run.sh@23 -- # local timen=1 00:07:56.371 09:31:19 -- vfio/run.sh@24 -- # local core=0x1 00:07:56.371 09:31:19 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:07:56.371 09:31:19 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:07:56.371 09:31:19 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:07:56.371 09:31:19 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:07:56.371 09:31:19 
-- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:07:56.371 09:31:19 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:07:56.371 09:31:19 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:07:56.371 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:07:56.371 09:31:19 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:07:56.630 [2024-11-29 09:31:19.220307] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:56.630 [2024-11-29 09:31:19.220375] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3193379 ] 00:07:56.630 EAL: No free 2048 kB hugepages reported on node 1 00:07:56.630 [2024-11-29 09:31:19.295784] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:56.630 [2024-11-29 09:31:19.365035] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:56.631 [2024-11-29 09:31:19.365198] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:56.890 INFO: Running with entropic power schedule (0xFF, 100). 00:07:56.890 INFO: Seed: 686147630 00:07:56.890 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:07:56.890 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:07:56.890 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:07:56.890 INFO: A corpus is not provided, starting from an empty corpus 00:07:56.890 #2 INITED exec/s: 0 rss: 62Mb 00:07:56.890 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:56.890 This may also happen if the target rejected all inputs we tried so far 00:07:56.890 [2024-11-29 09:31:19.641641] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:56.890 [2024-11-29 09:31:19.641681] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:57.408 NEW_FUNC[1/638]: 0x43d808 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:07:57.408 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:07:57.408 #5 NEW cov: 10772 ft: 10732 corp: 2/71b lim: 90 exec/s: 0 rss: 68Mb L: 70/70 MS: 3 CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:07:57.408 [2024-11-29 09:31:20.149102] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:57.408 [2024-11-29 09:31:20.149150] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:57.667 #6 NEW cov: 10786 ft: 14162 corp: 3/80b lim: 90 exec/s: 0 rss: 69Mb L: 9/70 MS: 1 CMP- DE: "\000\000\000\000\005a\011h"- 00:07:57.667 [2024-11-29 09:31:20.365404] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:57.667 [2024-11-29 09:31:20.365436] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:57.667 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:57.667 #7 NEW cov: 10803 ft: 15576 corp: 4/89b lim: 90 exec/s: 0 rss: 70Mb L: 9/70 MS: 1 CopyPart- 00:07:57.926 [2024-11-29 09:31:20.574588] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:57.926 [2024-11-29 09:31:20.574644] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:57.926 #8 NEW cov: 10803 ft: 15863 corp: 5/98b lim: 90 exec/s: 8 rss: 70Mb L: 9/70 MS: 1 ChangeBit- 00:07:58.185 [2024-11-29 09:31:20.771991] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:58.185 [2024-11-29 09:31:20.772022] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:58.185 #9 NEW cov: 10803 ft: 16423 corp: 6/133b lim: 90 exec/s: 9 rss: 71Mb L: 35/70 MS: 1 EraseBytes- 00:07:58.185 [2024-11-29 09:31:20.976531] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:58.185 [2024-11-29 09:31:20.976562] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:58.444 #15 NEW cov: 10803 ft: 16977 corp: 7/208b lim: 90 exec/s: 15 rss: 71Mb L: 75/75 MS: 1 InsertRepeatedBytes- 00:07:58.444 [2024-11-29 09:31:21.179020] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:58.444 [2024-11-29 09:31:21.179052] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:58.703 #16 NEW cov: 10803 ft: 17110 corp: 8/291b lim: 90 exec/s: 16 rss: 71Mb L: 83/83 MS: 1 CMP- DE: "\377\222\317\361\244=z\252"- 00:07:58.703 [2024-11-29 09:31:21.378294] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:58.703 [2024-11-29 09:31:21.378323] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:58.703 #18 NEW cov: 10810 ft: 17240 corp: 9/300b lim: 90 exec/s: 18 rss: 71Mb L: 9/83 MS: 2 ChangeBit-CMP- DE: 
"j\250\241\005\000\000\000\000"- 00:07:58.962 [2024-11-29 09:31:21.595024] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:07:58.962 [2024-11-29 09:31:21.595053] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:07:58.962 #19 NEW cov: 10810 ft: 17370 corp: 10/309b lim: 90 exec/s: 9 rss: 71Mb L: 9/83 MS: 1 ChangeBit- 00:07:58.962 #19 DONE cov: 10810 ft: 17370 corp: 10/309b lim: 90 exec/s: 9 rss: 71Mb 00:07:58.962 ###### Recommended dictionary. ###### 00:07:58.962 "\000\000\000\000\005a\011h" # Uses: 0 00:07:58.962 "\377\222\317\361\244=z\252" # Uses: 0 00:07:58.962 "j\250\241\005\000\000\000\000" # Uses: 0 00:07:58.962 ###### End of recommended dictionary. ###### 00:07:58.962 Done 19 runs in 2 second(s) 00:07:59.222 09:31:21 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6 00:07:59.222 09:31:21 -- ../common.sh@72 -- # (( i++ )) 00:07:59.222 09:31:21 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:59.222 09:31:21 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT 00:07:59.222 00:07:59.222 real 0m19.880s 00:07:59.222 user 0m27.616s 00:07:59.222 sys 0m1.891s 00:07:59.222 09:31:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:59.222 09:31:21 -- common/autotest_common.sh@10 -- # set +x 00:07:59.222 ************************************ 00:07:59.222 END TEST vfio_fuzz 00:07:59.222 ************************************ 00:07:59.222 00:07:59.222 real 1m25.464s 00:07:59.222 user 2m8.521s 00:07:59.222 sys 0m10.164s 00:07:59.222 09:31:22 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:59.222 09:31:22 -- common/autotest_common.sh@10 -- # set +x 00:07:59.222 ************************************ 00:07:59.222 END TEST llvm_fuzz 00:07:59.222 ************************************ 00:07:59.222 09:31:22 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]] 00:07:59.222 09:31:22 -- spdk/autotest.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:07:59.222 09:31:22 -- spdk/autotest.sh@372 -- # timing_enter post_cleanup 00:07:59.222 09:31:22 -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:59.222 09:31:22 -- common/autotest_common.sh@10 -- # set +x 00:07:59.481 09:31:22 -- spdk/autotest.sh@373 -- # autotest_cleanup 00:07:59.481 09:31:22 -- common/autotest_common.sh@1381 -- # local autotest_es=0 00:07:59.481 09:31:22 -- common/autotest_common.sh@1382 -- # xtrace_disable 00:07:59.481 09:31:22 -- common/autotest_common.sh@10 -- # set +x 00:08:06.059 INFO: APP EXITING 00:08:06.059 INFO: killing all VMs 00:08:06.059 INFO: killing vhost app 00:08:06.059 INFO: EXIT DONE 00:08:09.356 Waiting for block devices as requested 00:08:09.356 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:09.356 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:09.356 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:09.356 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:09.356 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:09.356 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:09.356 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:09.615 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:08:09.615 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:08:09.615 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:08:09.875 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:08:09.875 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:08:09.875 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:08:10.135 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:08:10.135 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:08:10.135 0000:80:04.0 (8086 2021): 
vfio-pci -> ioatdma 00:08:10.394 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:08:14.590 Cleaning 00:08:14.590 Removing: /dev/shm/spdk_tgt_trace.pid3155039 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3152553 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3153827 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3155039 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3155835 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3156160 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3156497 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3156842 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3157179 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3157464 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3157752 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3158069 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3158933 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3162142 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3162438 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3162741 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3162895 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3163330 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3163594 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3164051 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3164172 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3164476 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3164741 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3164831 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3165056 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3165593 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3165748 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3166007 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3166335 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3166642 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3166661 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3166729 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3166995 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3167278 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3167545 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3167834 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3168101 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3168376 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3168546 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3168751 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3168969 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3169256 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3169529 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3169810 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3170084 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3170369 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3170594 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3170816 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3170969 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3171233 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3171499 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3171788 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3172054 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3172342 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3172614 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3172841 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3173003 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3173211 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3173473 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3173757 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3174032 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3174316 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3174584 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3174804 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3174994 
00:08:14.590 Removing: /var/run/dpdk/spdk_pid3175204 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3175456 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3175746 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3176012 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3176301 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3176573 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3176857 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3176937 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3177282 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3178042 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3178589 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3179046 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3179890 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3180532 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3180947 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3181368 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3181913 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3182309 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3182750 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3183287 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3183722 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3184124 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3184673 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3185161 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3185506 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3186048 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3186591 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3186996 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3187428 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3187972 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3188467 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3188805 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3189341 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3189813 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3190449 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3190838 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3191375 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3191919 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3192462 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3193011 00:08:14.590 Removing: /var/run/dpdk/spdk_pid3193379 00:08:14.590 Clean 00:08:14.590 killing process with pid 3104653 00:08:18.787 killing process with pid 3104650 00:08:18.787 killing process with pid 3104652 00:08:18.787 killing process with pid 3104651 00:08:18.787 09:31:41 -- common/autotest_common.sh@1446 -- # return 0 00:08:18.787 09:31:41 -- spdk/autotest.sh@374 -- # timing_exit post_cleanup 00:08:18.787 09:31:41 -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:18.787 09:31:41 -- common/autotest_common.sh@10 -- # set +x 00:08:18.787 09:31:41 -- spdk/autotest.sh@376 -- # timing_exit autotest 00:08:18.787 09:31:41 -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:18.787 09:31:41 -- common/autotest_common.sh@10 -- # set +x 00:08:18.787 09:31:41 -- spdk/autotest.sh@377 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:08:18.787 09:31:41 -- spdk/autotest.sh@379 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:08:18.787 09:31:41 -- spdk/autotest.sh@379 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:08:18.787 09:31:41 -- spdk/autotest.sh@381 -- # [[ y == y ]] 00:08:18.787 09:31:41 -- spdk/autotest.sh@383 -- # hostname 00:08:18.787 09:31:41 -- spdk/autotest.sh@383 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc 
genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:08:18.787 geninfo: WARNING: invalid characters removed from testname! 00:08:19.726 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcda 00:08:19.726 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcda 00:08:19.726 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcda 00:08:31.933 09:31:52 -- spdk/autotest.sh@384 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:08:37.207 09:31:59 -- spdk/autotest.sh@385 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:08:41.400 09:32:03 -- spdk/autotest.sh@389 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:08:46.675 09:32:08 -- spdk/autotest.sh@390 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:08:50.990 09:32:13 -- spdk/autotest.sh@391 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 
'*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:08:55.181 09:32:17 -- spdk/autotest.sh@392 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:00.458 09:32:22 -- spdk/autotest.sh@393 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:09:00.458 09:32:22 -- common/autotest_common.sh@1689 -- $ [[ y == y ]] 00:09:00.458 09:32:22 -- common/autotest_common.sh@1690 -- $ lcov --version 00:09:00.458 09:32:22 -- common/autotest_common.sh@1690 -- $ awk '{print $NF}' 00:09:00.458 09:32:22 -- common/autotest_common.sh@1690 -- $ lt 1.15 2 00:09:00.458 09:32:22 -- scripts/common.sh@372 -- $ cmp_versions 1.15 '<' 2 00:09:00.458 09:32:22 -- scripts/common.sh@332 -- $ local ver1 ver1_l 00:09:00.458 09:32:22 -- scripts/common.sh@333 -- $ local ver2 ver2_l 00:09:00.458 09:32:22 -- scripts/common.sh@335 -- $ IFS=.-: 00:09:00.458 09:32:22 -- scripts/common.sh@335 -- $ read -ra ver1 00:09:00.458 09:32:22 -- scripts/common.sh@336 -- $ IFS=.-: 00:09:00.458 09:32:22 -- scripts/common.sh@336 -- $ read -ra ver2 00:09:00.458 09:32:22 -- scripts/common.sh@337 -- $ local 'op=<' 00:09:00.458 09:32:22 -- scripts/common.sh@339 -- $ ver1_l=2 00:09:00.458 09:32:22 -- scripts/common.sh@340 -- $ ver2_l=1 00:09:00.458 09:32:22 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v 00:09:00.458 09:32:22 -- scripts/common.sh@343 -- $ case "$op" in 00:09:00.458 09:32:22 -- scripts/common.sh@344 -- $ : 1 00:09:00.458 09:32:22 -- scripts/common.sh@363 -- $ (( v = 0 )) 00:09:00.458 09:32:22 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:00.458 09:32:22 -- scripts/common.sh@364 -- $ decimal 1 00:09:00.458 09:32:22 -- scripts/common.sh@352 -- $ local d=1 00:09:00.458 09:32:22 -- scripts/common.sh@353 -- $ [[ 1 =~ ^[0-9]+$ ]] 00:09:00.458 09:32:22 -- scripts/common.sh@354 -- $ echo 1 00:09:00.458 09:32:22 -- scripts/common.sh@364 -- $ ver1[v]=1 00:09:00.458 09:32:22 -- scripts/common.sh@365 -- $ decimal 2 00:09:00.458 09:32:22 -- scripts/common.sh@352 -- $ local d=2 00:09:00.458 09:32:22 -- scripts/common.sh@353 -- $ [[ 2 =~ ^[0-9]+$ ]] 00:09:00.458 09:32:22 -- scripts/common.sh@354 -- $ echo 2 00:09:00.458 09:32:22 -- scripts/common.sh@365 -- $ ver2[v]=2 00:09:00.458 09:32:22 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] )) 00:09:00.458 09:32:22 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] )) 00:09:00.458 09:32:22 -- scripts/common.sh@367 -- $ return 0 00:09:00.458 09:32:22 -- common/autotest_common.sh@1691 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:00.458 09:32:22 -- common/autotest_common.sh@1703 -- $ export 'LCOV_OPTS= 00:09:00.458 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:00.458 --rc genhtml_branch_coverage=1 00:09:00.458 --rc genhtml_function_coverage=1 00:09:00.458 --rc genhtml_legend=1 00:09:00.458 --rc geninfo_all_blocks=1 00:09:00.458 --rc geninfo_unexecuted_blocks=1 00:09:00.458 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:00.458 ' 00:09:00.458 09:32:22 -- common/autotest_common.sh@1703 -- $ LCOV_OPTS=' 00:09:00.458 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:00.458 --rc genhtml_branch_coverage=1 00:09:00.458 --rc genhtml_function_coverage=1 00:09:00.458 --rc genhtml_legend=1 00:09:00.458 --rc geninfo_all_blocks=1 00:09:00.458 --rc geninfo_unexecuted_blocks=1 00:09:00.458 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:00.458 ' 00:09:00.458 09:32:22 -- common/autotest_common.sh@1704 -- $ export 'LCOV=lcov 00:09:00.458 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:00.458 --rc genhtml_branch_coverage=1 00:09:00.458 --rc genhtml_function_coverage=1 00:09:00.458 --rc genhtml_legend=1 00:09:00.458 --rc geninfo_all_blocks=1 00:09:00.458 --rc geninfo_unexecuted_blocks=1 00:09:00.458 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:00.458 ' 00:09:00.458 09:32:22 -- common/autotest_common.sh@1704 -- $ LCOV='lcov 00:09:00.458 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:00.458 --rc genhtml_branch_coverage=1 00:09:00.458 --rc genhtml_function_coverage=1 00:09:00.458 --rc genhtml_legend=1 00:09:00.458 --rc geninfo_all_blocks=1 00:09:00.458 --rc geninfo_unexecuted_blocks=1 00:09:00.458 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:09:00.458 ' 00:09:00.458 09:32:22 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:09:00.458 09:32:22 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:09:00.458 09:32:22 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:00.458 09:32:22 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:00.458 09:32:22 -- paths/export.sh@2 -- $ 
00:09:00.458 09:32:22 -- common/autotest_common.sh@1703 -- $ export 'LCOV_OPTS=
00:09:00.458 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:09:00.458 --rc genhtml_branch_coverage=1
00:09:00.458 --rc genhtml_function_coverage=1
00:09:00.458 --rc genhtml_legend=1
00:09:00.458 --rc geninfo_all_blocks=1
00:09:00.458 --rc geninfo_unexecuted_blocks=1
00:09:00.458 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:09:00.458 '
00:09:00.458 09:32:22 -- common/autotest_common.sh@1703 -- $ LCOV_OPTS='
00:09:00.458 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:09:00.458 --rc genhtml_branch_coverage=1
00:09:00.458 --rc genhtml_function_coverage=1
00:09:00.458 --rc genhtml_legend=1
00:09:00.458 --rc geninfo_all_blocks=1
00:09:00.458 --rc geninfo_unexecuted_blocks=1
00:09:00.458 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:09:00.458 '
00:09:00.458 09:32:22 -- common/autotest_common.sh@1704 -- $ export 'LCOV=lcov
00:09:00.458 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:09:00.458 --rc genhtml_branch_coverage=1
00:09:00.458 --rc genhtml_function_coverage=1
00:09:00.458 --rc genhtml_legend=1
00:09:00.458 --rc geninfo_all_blocks=1
00:09:00.458 --rc geninfo_unexecuted_blocks=1
00:09:00.458 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:09:00.458 '
00:09:00.458 09:32:22 -- common/autotest_common.sh@1704 -- $ LCOV='lcov
00:09:00.458 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:09:00.458 --rc genhtml_branch_coverage=1
00:09:00.458 --rc genhtml_function_coverage=1
00:09:00.458 --rc genhtml_legend=1
00:09:00.458 --rc geninfo_all_blocks=1
00:09:00.458 --rc geninfo_unexecuted_blocks=1
00:09:00.458 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:09:00.458 '
00:09:00.458 09:32:22 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:09:00.458 09:32:22 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:09:00.458 09:32:22 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:09:00.458 09:32:22 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:09:00.458 09:32:22 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:00.458 09:32:22 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:00.458 09:32:22 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:00.458 09:32:22 -- paths/export.sh@5 -- $ export PATH
00:09:00.458 09:32:22 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
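Each paths/export.sh line above prepends its toolchain directory to PATH unconditionally, which is why /opt/golangci/1.54.2/bin, /opt/go/1.21.1/bin and /opt/protoc/21.7/bin all appear twice in the echoed result. That is harmless here, but the duplication is avoidable; a sketch of an idempotent prepend (path_prepend is a hypothetical helper for illustration, not something in the repo):

    # Sketch of a duplicate-free PATH prepend; paths/export.sh itself
    # prepends unconditionally, hence the repeats in the echoed PATH above.
    path_prepend() {
      case ":$PATH:" in
        *":$1:"*) ;;            # already present: leave PATH unchanged
        *) PATH="$1:$PATH" ;;   # otherwise prepend the directory
      esac
    }
    path_prepend /opt/golangci/1.54.2/bin
    path_prepend /opt/go/1.21.1/bin
    path_prepend /opt/protoc/21.7/bin
    export PATH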
00:09:00.458 09:32:22 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:09:00.458 09:32:22 -- common/autobuild_common.sh@440 -- $ date +%s
00:09:00.458 09:32:22 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1732869142.XXXXXX
00:09:00.458 09:32:22 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1732869142.0UbsSc
00:09:00.458 09:32:22 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]]
00:09:00.458 09:32:22 -- common/autobuild_common.sh@446 -- $ '[' -n '' ']'
00:09:00.458 09:32:22 -- common/autobuild_common.sh@449 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/'
00:09:00.458 09:32:22 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:09:00.458 09:32:22 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:09:00.458 09:32:22 -- common/autobuild_common.sh@456 -- $ get_config_params
00:09:00.458 09:32:22 -- common/autotest_common.sh@397 -- $ xtrace_disable
00:09:00.458 09:32:22 -- common/autotest_common.sh@10 -- $ set +x
00:09:00.458 09:32:22 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user'
00:09:00.458 09:32:22 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112
00:09:00.458 09:32:22 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:00.458 09:32:22 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:09:00.458 09:32:22 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]]
00:09:00.458 09:32:22 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:09:00.458 09:32:22 -- spdk/autopackage.sh@19 -- $ timing_finish
00:09:00.458 09:32:22 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:09:00.458 09:32:22 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:09:00.458 09:32:22 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:00.458 09:32:22 -- spdk/autopackage.sh@20 -- $ exit 0
00:09:00.458 + [[ -n 3060690 ]]
00:09:00.458 + sudo kill 3060690
00:09:00.468 [Pipeline] }
00:09:00.484 [Pipeline] // stage
00:09:00.490 [Pipeline] }
00:09:00.505 [Pipeline] // timeout
00:09:00.511 [Pipeline] }
00:09:00.533 [Pipeline] // catchError
00:09:00.539 [Pipeline] }
00:09:00.554 [Pipeline] // wrap
00:09:00.561 [Pipeline] }
00:09:00.577 [Pipeline] // catchError
00:09:00.588 [Pipeline] stage
00:09:00.591 [Pipeline] { (Epilogue)
00:09:00.607 [Pipeline] catchError
00:09:00.609 [Pipeline] {
00:09:00.625 [Pipeline] echo
00:09:00.627 Cleanup processes
00:09:00.635 [Pipeline] sh
00:09:00.925 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:00.926 3203088 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:00.941 [Pipeline] sh
00:09:01.227 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:01.227 ++ awk '{print $1}'
00:09:01.227 ++ grep -v 'sudo pgrep'
00:09:01.227 + sudo kill -9
00:09:01.227 + true
00:09:01.239 [Pipeline] sh
00:09:01.524 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:09:01.524 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:09:01.524 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:09:02.904 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:09:12.896 [Pipeline] sh
00:09:13.181 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:09:13.181 Artifacts sizes are good
00:09:13.197 [Pipeline] archiveArtifacts
00:09:13.207 Archiving artifacts
00:09:13.340 [Pipeline] sh
00:09:13.631 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest
00:09:13.646 [Pipeline] cleanWs
00:09:13.655 [WS-CLEANUP] Deleting project workspace...
00:09:13.655 [WS-CLEANUP] Deferred wipeout is used...
00:09:13.661 [WS-CLEANUP] done
00:09:13.663 [Pipeline] }
00:09:13.681 [Pipeline] // catchError
00:09:13.694 [Pipeline] sh
00:09:14.050 + logger -p user.info -t JENKINS-CI
00:09:14.059 [Pipeline] }
00:09:14.072 [Pipeline] // stage
00:09:14.078 [Pipeline] }
00:09:14.092 [Pipeline] // node
00:09:14.097 [Pipeline] End of Pipeline
00:09:14.238 Finished: SUCCESS
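For reference, the Epilogue's cleanup shell step traced above reduces to a common idiom: pgrep -af lists candidate processes with full command lines, grep -v drops the pgrep invocation itself, awk keeps the PID column, and kill -9 reaps whatever remains; the trailing '+ true' in the log is the guard that keeps an empty match list (kill -9 invoked with no arguments) from failing the stage. A sketch, with ws standing in for the workspace path:

    # Sketch of the cleanup idiom traced above; ws is the workspace path.
    ws=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    pids=$(sudo pgrep -af "$ws" | grep -v 'sudo pgrep' | awk '{print $1}')
    # $pids stays unquoted on purpose so multiple PIDs word-split into
    # separate kill arguments; '|| true' tolerates an empty list.
    sudo kill -9 $pids || true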